Nearly two years ago, MERMAID users told us they needed a simpler way to integrate data from photo quadrat images into their MERMAID workflows. First, we built Easy PQT — a tool to help users import CoralNet data into MERMAID Collect. But we knew scientists needed something even more powerful: a faster, integrated way to analyze photo quadrats directly in MERMAID, and easily access this data alongside other information on fish, bleaching, and benthic transects.
From Idea to Impact: Why We Built This
We had a bold idea: what if MERMAID could offer one shared AI model for all users to classify images? By using MERMAID’s standardized taxonomy, we could provide consistent labels across all MERMAID projects, saving time, reducing friction and data cleaning, and accelerating both project-level and global reporting. Instead of each user training a custom model, what if users could simply drag and drop their images into MERMAID Collect and see results immediately from a MERMAID AI model?
What started as a vision is now a reality.
We’re excited to announce MERMAID AI: Image Classification (Beta), launched on World Oceans Day 2025 and developed in partnership with CoralNet from the University of California San Diego and Scripps Institution of Oceanography. This is the first step in a new era of coral reef monitoring: bringing AI-powered image classification directly into MERMAID, where users can seamlessly combine benthic imagery with data on fish, coral bleaching, habitat complexity, and more.
Now in Beta: Try It, Shape It
This new feature is being released as a Beta — an early version that’s ready for real-world use but still in development. We're inviting MERMAID users to try it out and share feedback to help us improve both the feature and the AI model before the full launch. You might notice a few things still in progress — and that’s exactly the point. By using the beta version, you're helping shape the future of image classification in MERMAID.
How to Use MERMAID AI Today
So what can you do with MERMAID AI: Image Classification today? It’s built directly into MERMAID Collect:
Select ‘Add a collect record’, choose ‘Benthic Photo Quadrat’, and click ‘Classify images with AI (Beta)’. (You can still add classified image data manually.)
Add the usual information about the monitoring site, management, observers, and method (e.g., transect length, quadrat size, number of quadrat images).
Click ‘Upload photos’, then drag and drop the photos you want to classify with AI.
Start verifying each classified image (click ‘Review’) while other images queue for AI classification. In each image, we drop a 5x5 grid of 25 points to classify. Each point requires verification, and we have suggestions for efficiently confirming or updating classifications; you can review, accept, or adjust the labels with just a few clicks.
After you ‘Save changes’, see summary metrics across all images, including % hard coral cover and other high-level benthic groups (e.g., soft coral, macroalgae, turf).
It’s fast, intuitive, and designed to work seamlessly with the rest of your coral reef survey data.
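Under the hood, those summary metrics are simple proportions: each category’s share of all verified points across your images. Here is a minimal sketch of that calculation in Python, with invented labels and a hypothetical helper function (our illustration, not MERMAID’s actual code):

```python
from collections import Counter

# Hypothetical example: each image contributes 25 verified point labels
# (the 5x5 grid). Percent cover for a category is that category's share
# of all points across the transect.
def percent_cover(images: list[list[str]]) -> dict[str, float]:
    counts = Counter(label for image in images for label in image)
    total = sum(counts.values())
    return {label: 100 * n / total for label, n in counts.items()}

# Two quadrat images, 25 points each
quadrats = [
    ["Hard coral"] * 14 + ["Macroalgae"] * 6 + ["Turf algae"] * 5,
    ["Hard coral"] * 10 + ["Soft coral"] * 8 + ["Sand"] * 7,
]
print(percent_cover(quadrats))  # {'Hard coral': 48.0, 'Macroalgae': 12.0, ...}
```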
Here’s what it looks like in action:
Powered by Open Science and Collaboration
So how does the AI know what it’s looking at in each image?
Behind the scenes, MERMAID AI uses a shared classification model trained on 16 million public images and their classifications from CoralNet. This training data powers a deep-learning engine developed with CoralNet computer vision scientists from the University of California San Diego and Scripps Institution of Oceanography.
To make CoralNet’s model work for MERMAID, we translated hundreds of CoralNet labels into their MERMAID taxonomy equivalents and adapted CoralNet’s PySpacer modelling framework into a single, unified model for MERMAID AI. Because both MERMAID and CoralNet are open-source platforms, this collaboration has improved performance and features for users of both tools. It’s a great example of how collaboration on open-source software can lead to innovation and shared benefits, not competition.
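Conceptually, the translation step is a lookup from CoralNet label names to MERMAID benthic attributes. Below is a toy sketch with invented label codes, purely to illustrate the idea; the real mapping covers hundreds of curated labels:

```python
# Illustrative only: invented CoralNet label codes mapped to MERMAID
# taxonomy equivalents. The real table is far larger and hand-curated.
CORALNET_TO_MERMAID = {
    "Acrop_br": "Acropora branching",
    "Porit_mass": "Porites massive",
    "Macro": "Macroalgae",
    "TA": "Turf algae",
}

def to_mermaid(coralnet_label: str) -> str:
    # Unmapped labels fall back to a placeholder so they can be
    # flagged for manual review instead of failing silently.
    return CORALNET_TO_MERMAID.get(coralnet_label, "Unknown (needs review)")

print(to_mermaid("Porit_mass"))  # Porites massive
```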
Verifying and Improving Classifications
When you upload your photo quadrats, the MERMAID AI model gets to work, but you’re still in control. For each of the 25 classification points per image, the AI suggests a label if its confidence is over 50%. You’ll see the top prediction plus the next three most likely guesses (a short sketch of this selection logic follows the list below). From there, you can:
Confirm the AI’s suggestion
Select one of the other suggested options
Or manually choose any label from the MERMAID benthic taxonomy with just a few keystrokes
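Here is a minimal sketch of that selection rule, assuming the classifier returns one probability per label (the labels and function name are invented for illustration):

```python
# Suggest the top label only when its probability clears the 50%
# threshold; always surface the next three most likely alternatives.
def suggest(probabilities: dict[str, float], threshold: float = 0.5):
    ranked = sorted(probabilities.items(), key=lambda kv: kv[1], reverse=True)
    top_label, top_prob = ranked[0]
    suggestion = top_label if top_prob > threshold else None
    alternatives = [label for label, _ in ranked[1:4]]
    return suggestion, alternatives

probs = {"Hard coral": 0.62, "Soft coral": 0.18, "Macroalgae": 0.12,
         "Turf algae": 0.05, "Sand": 0.03}
print(suggest(probs))  # ('Hard coral', ['Soft coral', 'Macroalgae', 'Turf algae'])
```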
This human-in-the-loop design is central to how we improve the model over time. Your verified classifications help train better models — and the more you contribute, the better MERMAID AI becomes for everyone.
Model Performance and Accuracy
So does the model work? Yes — and we’re still improving it.
MERMAID Lead Analyst Iain Caldwell and Sparkgeo Data Scientist Lauren Yee spent months training and testing models to recognize both broad benthic groups (like hard coral, soft coral, and macroalgae) and specific coral genera (such as Acropora, Porites, and Pocillopora), including their growth forms (e.g., Porites branching vs. Porites massive).
The current model can identify 54 attributes with useful precision and accuracy, including:
12 top-level benthic categories
37 coral genera with growth forms
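If you’re curious how figures like these are typically measured, the standard approach is to compare model predictions against human-verified labels on a held-out set of points. Here is a minimal sketch with scikit-learn, using made-up data (our illustration, not the team’s actual evaluation pipeline):

```python
from sklearn.metrics import classification_report

# Hypothetical held-out evaluation: human-verified labels vs. model
# predictions for a few points (real evaluations use many thousands).
verified  = ["Hard coral", "Macroalgae", "Hard coral", "Soft coral", "Turf algae"]
predicted = ["Hard coral", "Macroalgae", "Soft coral", "Soft coral", "Turf algae"]

# Prints per-class precision and recall plus overall accuracy.
print(classification_report(verified, predicted, zero_division=0))
```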
Open Data, Open Future
One of the things we’re most proud of with MERMAID Image Classification (Beta) is how we’re building for the future. Supported by a 2025 AWS Imagine Grant, we’re launching a public image and annotation repository — a “public bucket” where all images and classification data will be available for researchers and developers to build better machine learning models.
We’re making these data fully open and anonymized. That means:
All image annotations are public for model development
Only MERMAID users can see classified results linked to location, date, and site information
We believe this strikes the right balance between protecting user privacy and advancing global coral reef science. You can read more in our updated Terms of Service.
What’s Next for MERMAID AI
We’re not stopping here. With AWS, we’re already building the next generation of MERMAID AI: one that’s fully cloud-native. Using tools like Amazon SageMaker and MLflow, we’ll be able to train, test, and deploy new models rapidly as users upload new imagery and annotations.
In the future, this could allow us to:
Offer regional or habitat-specific models
Benchmark model performance through automated accuracy reports
And even crowdsource global reef imagery from scuba divers and citizen scientists
Our long-term goal? Grow reef monitoring coverage from less than 15% of the world’s coral reefs today to 100% by 2035. This vision is supported by our selection as a Phase I grantee of the Bezos Earth Fund AI Grand Challenge for Climate and Nature — and we can’t wait to share what comes next.
Thank You
We hope you're as excited as we are to integrate the latest AI and machine learning tools into MERMAID. This launch is just the beginning — and it’s only possible thanks to our incredible global user community.
Special thanks to:
WCS field programs and coral reef scientists for field testing and feedback.
Al Leonard (UX), Alexandra Kler Lago (Field Support), and Amkieltiela (User Community).
Stacy Davis, our new Senior Software Engineer — this is her first major feature launch!
Iain Caldwell (Lead Data Analyst) for refining the training inputs for the initial MERMAID Image Classification model.
Rah Chalfant for product marketing and communications.
Learn more about the MERMAID team that makes this all possible.
And of course, deep gratitude to our funders who share our belief in open-source, accessible, and powerful tools for coral reef science: Bloomberg Philanthropies, Paul M. Angell Family Foundation, the Tiffany & Co. Foundation, EDB, and AWS Imagine Grant.
With appreciation,
Emily & Kim