VIDEO: IT specialist partners with conservation charity to detect illegal cardamom cultivation

To address the challenge of forest degradation in Mu Cang Chai forest in Vietnam, Aiforgood Asia partnered with Crayon for its expertise in data and AI to develop a Proof-of-Concept computer vision-based solution that utilised remote sensing and Machine Learning to detect illegal cardamom cultivation in satellite imagery. Aiforgood Asia approached Fauna & Flora to explore innovative and cost-effective solutions to track changes in the forest and measure the extent of habitat degradation. Lam Van Hoang, Country Director at Fauna & Flora International, Vietnam, told Intelligent SME.tech more about the project.

Fauna & Flora is an international conservation charity and non-governmental organisation dedicated to protecting the diversity of life on Earth. Fauna & Flora works closely with local conservation partners around the world to save nature.

In Vietnam, Fauna & Flora faced the challenge of habitat degradation due to cardamom crop planting in protected reserves, particularly in the fragile ecosystem of Mu Cang Chai forest, home to highly endangered primates, including the western black gibbon and the Indochinese grey langur.

The charity needed to track changes in the forest's fragile ecosystem and to measure the extent of degradation in the gibbons' habitat caused by cardamom cultivation.

Aiforgood Asia, an international NGO focused on AI and technology for environmental, social and governance (ESG) projects, approached Fauna & Flora. By mapping the locations most affected by cardamom, Fauna & Flora, together with its government partners, will be better able to devise and target strategies to reduce the degradation of this most sensitive of habitats.

Aiforgood Asia’s ESG-as-a-Service helps match local NGOs and community leaders with corporations looking to run AI-related ESG projects, leveraging AI and remote sensing technologies to improve health and welfare, reduce inequality, fight climate change, and preserve our oceans and forests.

To address the challenge in Mu Cang Chai forest, Aiforgood Asia partnered with Crayon, an IT service specialist with expertise in data and AI, to develop a Proof-of-Concept (PoC) computer vision-based solution that utilised remote sensing and Machine Learning (ML) to detect cardamom crops in satellite imagery. The PoC was tested in the northern ‘core’ region of the conservation area, with the outcomes evaluated based on feasibility and cost.

The results of the PoC showed that remote sensing and ML could, in principle, effectively detect cardamom crops in satellite imagery, provided that the quality of the satellite image was sufficient. This is the first step towards providing Fauna & Flora with a platform that can be scaled to support its conservation activities in the Mu Cang Chai forest and across Vietnam. The study also revealed that the image resolution necessary for detecting cardamom, together with the acquisition of ground-truthing data, was an essential factor in the cost of the project.

The team faced a significant challenge when they began their project: how to source the most cost-efficient remote sensing data for their model. Their ideal source was Sentinel-2, a satellite constellation launched by the European Space Agency. Images from Sentinel-2 are free to access and captured at a regular revisit frequency, providing more opportunities to obtain a cloud-free image.

However, it became clear that cardamom cultivation was not visible in low- and medium-resolution satellite images, including those from Sentinel-2 and Landsat. To gather the necessary ground truth data for training the model, the Crayon team had to rely on experts to collect GPS locations of cardamom crops during field expeditions.
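For readers curious how such imagery is sourced in practice, the short sketch below shows one common way to search the free Sentinel-2 archive for relatively cloud-free scenes over an area of interest via a public STAC catalogue. It is purely illustrative and is not the team's pipeline; the endpoint, bounding box coordinates and cloud-cover threshold are assumptions.

```python
# Illustrative sketch only (not the project's actual pipeline): search a public
# STAC catalogue for Sentinel-2 scenes over an area of interest, filtered by
# cloud cover. Endpoint, bounding box and thresholds are assumptions.
from pystac_client import Client

# Public Earth Search STAC catalogue hosting Sentinel-2 Level-2A data
catalog = Client.open("https://earth-search.aws.element84.com/v1")

# Approximate bounding box over the area of interest (lon/lat, placeholder values)
aoi_bbox = [104.05, 21.65, 104.25, 21.85]

search = catalog.search(
    collections=["sentinel-2-l2a"],
    bbox=aoi_bbox,
    datetime="2022-01-01/2022-12-31",
    query={"eo:cloud_cover": {"lt": 20}},  # keep scenes with <20% cloud cover
)

items = list(search.items())
print(f"Found {len(items)} relatively cloud-free Sentinel-2 scenes")
for item in items[:5]:
    print(item.id, item.properties["eo:cloud_cover"])
```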

Fortunately, the Fauna & Flora team was able to provide ground truth data for the project with the help of a grant from the Darwin Initiative. They used geo-referenced, orthomosaicked drone images gathered during their ground truth expeditions to accurately label the cardamom areas. The data was processed in DroneDeploy software, as part of a partnership with Fauna & Flora.
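As an illustration of how labelled cardamom areas like these can be turned into training data, the hedged sketch below rasterises labelled polygons onto the pixel grid of a satellite image so they can serve as a segmentation mask. It is not the project's actual tooling; the file names and formats are hypothetical.

```python
# Hedged sketch: burn manually labelled cardamom polygons (e.g. digitised over
# drone orthomosaics) into a pixel mask aligned to a satellite image, suitable
# as a segmentation training target. File names are hypothetical.
import geopandas as gpd
import rasterio
from rasterio import features

with rasterio.open("worldview2_tile.tif") as src:              # hypothetical satellite tile
    transform, shape, crs = src.transform, (src.height, src.width), src.crs

labels = gpd.read_file("cardamom_labels.geojson").to_crs(crs)  # hypothetical label file

# Burn value 1 into pixels covered by cardamom polygons, 0 elsewhere
mask = features.rasterize(
    ((geom, 1) for geom in labels.geometry),
    out_shape=shape,
    transform=transform,
    fill=0,
    dtype="uint8",
)

with rasterio.open(
    "cardamom_mask.tif", "w", driver="GTiff",
    height=shape[0], width=shape[1], count=1,
    dtype="uint8", crs=crs, transform=transform,
) as dst:
    dst.write(mask, 1)
```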

To build the model, the team used a U-Net, a widely used image segmentation architecture. They found that the colour gradient within the cardamom areas was smaller than in other forested areas due to the plants' size and homogeneous canopy.
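For context, a U-Net is an encoder-decoder network with skip connections that produces a class prediction for every pixel. The sketch below is a minimal U-Net-style model in PyTorch, shown only to illustrate the architecture family; the team's actual depth, channel counts and input bands were not published, so the values here are assumptions.

```python
# Minimal U-Net-style encoder-decoder in PyTorch, for illustration only; the
# team's actual model configuration is not published, so depth, channel counts
# and the 4-band input are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    def __init__(self, in_channels=4, num_classes=2):  # e.g. 4 multispectral bands
        super().__init__()
        self.enc1 = conv_block(in_channels, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)   # 64 (skip) + 64 (upsampled)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)    # 32 (skip) + 32 (upsampled)
        self.head = nn.Conv2d(32, num_classes, 1)  # per-pixel class scores

    def forward(self, x):
        e1 = self.enc1(x)                       # skip connection 1
        e2 = self.enc2(self.pool(e1))           # skip connection 2
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                    # logits per pixel

# Example: a batch of 4-band 256x256 image tiles -> per-pixel class logits
logits = MiniUNet()(torch.randn(1, 4, 256, 256))
print(logits.shape)  # torch.Size([1, 2, 256, 256])
```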

The model was trained on a 500 x 500 m area and achieved approximately 96% within-sample pixel accuracy. The team used high-resolution satellite imagery from WorldView-2, which has a resolution of 0.46 metres, to train the model.
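To make the accuracy figure concrete, the short sketch below shows how a within-sample pixel accuracy of roughly 96% would be computed by comparing a predicted mask against a reference mask pixel by pixel. The masks here are synthetic stand-ins, and the tile size simply reflects a 500 x 500 m area at 0.46 m resolution.

```python
# Sketch of the pixel-accuracy calculation behind a figure like the ~96% quoted
# above; the reference and predicted masks are hypothetical stand-ins.
import numpy as np

# At WorldView-2's ~0.46 m resolution, a 500 x 500 m area is roughly 1087 x 1087 pixels
side = int(round(500 / 0.46))

rng = np.random.default_rng(0)
reference = rng.integers(0, 2, size=(side, side), dtype=np.uint8)  # ground-truth mask
predicted = reference.copy()
flip = rng.random(reference.shape) < 0.04                          # flip ~4% of pixels
predicted[flip] = 1 - predicted[flip]

pixel_accuracy = (predicted == reference).mean()
print(f"Within-sample pixel accuracy: {pixel_accuracy:.2%}")       # ~96%
```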

To obtain the data, they purchased small areas of imagery from EarthImages to keep costs low. The team also used QGIS software for parts of the exploratory data analysis and for sharpening the multispectral bands of the high-resolution image with panchromatic data, which resulted in an approximately 4x higher resolution.
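Pan-sharpening combines the fine spatial detail of the panchromatic band with the spectral detail of the multispectral bands. The team did this step in QGIS; purely as an illustration of the idea, the sketch below applies a simple Brovey-style transform in Python, with hypothetical file names.

```python
# Illustration only (the team used QGIS for this step): a simple Brovey-style
# pan-sharpening, upsampling the multispectral bands onto the panchromatic grid
# and rescaling them by the pan intensity. File names are hypothetical.
import numpy as np
import rasterio
from rasterio.enums import Resampling

with rasterio.open("multispectral.tif") as ms_src, rasterio.open("panchromatic.tif") as pan_src:
    pan = pan_src.read(1).astype("float32")
    # Resample the coarser multispectral bands onto the panchromatic pixel grid
    ms = ms_src.read(
        out_shape=(ms_src.count, pan_src.height, pan_src.width),
        resampling=Resampling.bilinear,
    ).astype("float32")
    profile = pan_src.profile.copy()
    profile.update(count=ms_src.count, dtype="float32")

# Brovey transform: scale each band by the ratio of pan intensity to mean band intensity
intensity = ms.mean(axis=0) + 1e-6
sharpened = ms * (pan / intensity)

with rasterio.open("pansharpened.tif", "w", **profile) as dst:
    dst.write(sharpened)
```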

Lam Van Hoang, Country Director at Fauna & Flora International, Vietnam, and Hattie Branson, Technical Specialist at Fauna & Flora International, spoke exclusively to Intelligent SME.tech about the project:

How central is technology for preserving life on Earth?

Van Hoang: I consider technology to be very central in carrying out conservation all over the world, not just in Vietnam. It can inform not only the conservation actors, but also government offices, partners and local communities. All over the world, technology needs to be at the centre of conservation.

What progress has the project made over the past year and how has the Proof-of-Concept application supported this?

Van Hoang: We have been applying many different conservation technologies. For example, we have been using a very basic GPS device to record data during our patrol work. We then turn it into SMART data and now we are using SMART Connect. That means all of our data from when we are patrolling the site can be sent to our centre much more easily. We have an immediate response to whatever is happening on the ground.
We also recently used a thermal drone to monitor the habitat. We are using the thermal drone to conduct our diversity survey and our primate survey. We use our camera traps to record data for biodiversity monitoring, and we use the cameras in the forest to monitor what is happening there.
We also used remote sensing to monitor the change in habitat caused by local community encroachment on forest resources. For example, the cardamom grows underneath the forest canopy, so we would not be able to use people on the ground to monitor everything. So, we applied remote sensing to measure that change in the forest habitat and to measure the degradation of the area.

What were the specific challenges inherent in both instigating and sustaining a remote sensing and Machine Learning solution for such a project and how were they overcome?

Branson: There are a number of challenges. The first that we came across was accessing good-quality satellite imagery. Our region is very cloudy, and we had only about 5-10 dates per year with usable imagery, usually clustered in short windows of good weather.
Acquiring that good-quality data can also be expensive once you move into purchasing satellite imagery. We overcame that by incorporating drone imagery into the project and using it to reference and label the cardamom locations in our satellite imagery. This brought about a second challenge: using the drone imagery, we then had to implement a labelling solution where we labelled the cardamom. This particular area is quite mountainous, so very accurate ortho-rectification was needed to ensure the drone imagery could be accurately overlaid on the satellite imagery and that the cardamom locations in the two lined up. This had to be done manually to ensure it was as accurate as possible.
This is a challenge we are hoping to overcome in phase 2 of the project as we look to introduce much more accurate ortho-rectification processing.
Similarly, developing the algorithm was definitely a trial-and-error process of using different approaches to train the algorithm and the classifications to produce better accuracies. For example, we looked at cardamom that appeared bright in the imagery and cardamom that appeared dark, and used those two classes to produce an output that classified two different types of cardamom, increasing the accuracy that way as well.
The last challenge we are looking at in phase 2 is deploying this product and enabling implementation across a much wider area than the Proof-of-Concept allowed.
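As an aside on the overlay step Branson describes, the sketch below shows one generic way to reproject a drone orthomosaic onto a satellite image's pixel grid so the two can be compared directly. It is not the team's workflow, the file names are hypothetical, and accurate co-registration in mountainous terrain still depends on careful ortho-rectification upstream.

```python
# Generic illustration (not the team's actual workflow) of reprojecting a drone
# orthomosaic onto a satellite image's grid so labels drawn on the drone imagery
# can be transferred onto the satellite pixels. File names are hypothetical.
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

with rasterio.open("satellite_tile.tif") as sat, rasterio.open("drone_orthomosaic.tif") as drone:
    aligned = np.zeros((drone.count, sat.height, sat.width), dtype="float32")
    reproject(
        source=drone.read().astype("float32"),
        destination=aligned,
        src_transform=drone.transform,
        src_crs=drone.crs,
        dst_transform=sat.transform,
        dst_crs=sat.crs,
        resampling=Resampling.bilinear,
    )
# 'aligned' now shares the satellite tile's pixel grid and can be overlaid on it directly.
```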

Was there anything that stood out from the project in terms of how technology was used?

Branson: One thing I definitely took from it was how difficult it can be, especially when there is activity beneath the canopy. This is very difficult to monitor when you are working in thick, dense forest and trying to identify something that grows at bush level on the ground. It was difficult to do even via remote sensing and with staff on the ground. It showed how important it is to continue to develop solutions like this, so that we can deal with this effectively and provide solutions that allow informed conservation decision-making.
