SoilScan is a mobile-first smart agriculture app built with React Native and run through Expo, with deployment planned. It lets users scan and analyze soil using their phone's camera. The app uses a ResNet50 convolutional neural network to classify soil texture and a trained ExtraTreeClassifier model to recommend the most suitable fertilizer based on the detected soil texture.
The machine learning logic is powered by a Python backend hosted on Hugging Face Spaces, which exposes an API to handle soil texture classification and fertilizer recommendation tasks. All model inference is handled server-side to keep the app lightweight and efficient.
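To make the two-stage flow concrete, here is a minimal Python sketch of how the backend might chain the models. The function names, feature encoding, and training data below are illustrative assumptions, not the app's real implementation; the ResNet50 stage is stubbed out, and a toy scikit-learn `ExtraTreeClassifier` stands in for the trained recommendation model.

```python
# Hypothetical sketch of the server-side pipeline: CNN texture classification
# followed by tree-based fertilizer recommendation. Not the real SoilScan code.
import numpy as np
from sklearn.tree import ExtraTreeClassifier

TEXTURES = ["sandy", "loamy", "clayey"]          # assumed label set
FERTILIZERS = ["Urea", "DAP", "14-35-14"]        # assumed label set

def classify_texture(image_bytes: bytes) -> str:
    # Placeholder for the ResNet50 forward pass; the real backend would
    # preprocess the image and run CNN inference here.
    return "loamy"

# Toy training data: [texture_id, N, P, K, temperature, moisture, crop_id]
rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(60, 7))
y = rng.integers(0, len(FERTILIZERS), size=60)
model = ExtraTreeClassifier(random_state=0).fit(X, y)

def recommend(texture: str, n, p, k, temp, moisture, crop_id) -> str:
    features = [[TEXTURES.index(texture), n, p, k, temp, moisture, crop_id]]
    return FERTILIZERS[model.predict(features)[0]]

texture = classify_texture(b"...")                           # stage 1: CNN
fertilizer = recommend(texture, 40, 30, 20, 28.0, 55.0, 2)   # stage 2: tree
print(texture, fertilizer)
```

Running inference server-side like this keeps both model weights and the scikit-learn dependency off the device, which is what keeps the mobile app lightweight.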
- React Native (Expo) – for building cross-platform mobile UI
- ResNet50 – for soil texture classification from images
- ExtraTreeClassifier – for fertilizer recommendation based on soil classification
- Python – backend logic and ML model hosting
- Hugging Face Spaces – lightweight cloud backend for API hosting
Soil Image Classification
The user can upload an existing photo or take a new one. The app returns:
- Soil texture
- Description of soil texture
- Properties of the soil
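The classification endpoint's response might look like the sketch below. The field names (`texture`, `description`, `properties`) are assumptions for illustration; the actual JSON schema exposed by the Hugging Face Spaces API may differ.

```python
# Hypothetical response payload from the soil classification endpoint.
import json

raw = json.dumps({
    "texture": "Clayey",
    "description": "Fine particles that retain water well.",
    "properties": ["high water retention", "poor drainage"],
})

result = json.loads(raw)
# The app would render these three pieces on the results screen.
summary = f"{result['texture']}: {result['description']}"
print(summary)
```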
Fertilizer Recommendation
Note: The predicted soil texture is automatically pre-selected as a parameter when you request a fertilizer recommendation.
The user can input NPK values, temperature, moisture, and crop type. The app returns:
- Recommended fertilizer for the given soil properties
- A description of the fertilizer
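A small sketch of how the app might assemble the recommendation request, with the predicted texture carried over automatically as described in the note above. The key names are assumptions, not the real API schema.

```python
# Hypothetical request payload for the fertilizer recommendation endpoint.
def build_fertilizer_request(predicted_texture, n, p, k,
                             temperature, moisture, crop_type):
    return {
        "soil_texture": predicted_texture,  # pre-filled from classification
        "nitrogen": n,
        "phosphorus": p,
        "potassium": k,
        "temperature": temperature,
        "moisture": moisture,
        "crop_type": crop_type,
    }

payload = build_fertilizer_request("Loamy", 37, 0, 0, 26.0, 38.0, "Maize")
print(payload["soil_texture"])
```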
- npm: 10.9.0
- react-native: 0.78.2
- react: 18.x
To run the SoilScan app locally, follow these steps:
```bash
# Clone the repository
git clone https://github.com/ljiro/SoilScan.git
cd SoilScan

# Make sure all dependencies are installed
npm install

# For local testing
npx expo start

# For over-the-internet testing
npx expo start --tunnel
```



