Our methodology
The cornerstone of our vision is that the ocean is computable and that a world with ocean certainty is possible.
The problem
The global economy runs on the ocean, but operates in an environment of massive uncertainty. Why is the ocean the last great data gap on Earth?
For the last 50 years, weather modeling has de-risked aviation, agriculture, and energy, yet ocean forecasts have not followed this same pace of progress.
We believe the disparity is an issue of scale. Ocean "weather" is orders of magnitude smaller than its atmospheric counterpart, yet the main sources of ocean forecasts are global models whose spatial resolution is comparable to, or even coarser than, that of weather models.
Global ocean models have a grid resolution of about 9 km, but hazardous currents in the world's key ports are often as narrow as 200 meters. Modeling this globally would require an estimated 10,000x increase in computational power (and cost), a scale-up that is simply not feasible.
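A back-of-envelope calculation shows where an estimate of this magnitude comes from. Refining from 9 km to 200 m is a 45x refinement in each horizontal direction; the cell count grows with refinement squared, and for explicit time stepping the CFL-limited timestep shrinks by another factor of the refinement. That brackets the cost multiplier between roughly 2,000x and 90,000x, consistent with the ~10,000x order of magnitude above (the exact scaling is a simplifying assumption on our part):

```python
# Back-of-envelope scaling for refining a global ocean model from
# ~9 km to ~200 m grid spacing. Horizontal cell count grows with
# refinement**2; an explicit model's CFL-limited timestep adds
# roughly one more factor of the refinement.

coarse_dx_m = 9_000   # typical global ocean model grid spacing
fine_dx_m = 200       # width of hazardous currents in key ports

refinement = coarse_dx_m / fine_dx_m   # 45x finer in each direction
cell_factor = refinement ** 2          # ~2,000x more horizontal cells
timestep_factor = refinement           # ~45x more timesteps (CFL)

print(f"{refinement:.0f}x finer grid")
print(f"~{cell_factor:,.0f}x more cells, "
      f"~{cell_factor * timestep_factor:,.0f}x total cost")
```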
Satellites can't measure these currents and machine learning can't solve this problem because there isn't enough fine-scale coastal data for training. We're forced to design a different approach.
The solution
Current Lab achieves superior ocean accuracy and precision by running a global mosaic of distributed, high-resolution regional models focused on coastal zones and key chokepoints.
The advantages of this approach:
- Computational power is directed towards regions where existing accuracy is the worst.
- Each model is independent, enabling fine-tuning of local physics.
- Datasets are fused together in post-processing for a gap-free global ocean solution.
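To make the last point concrete, here is a minimal sketch of what fusing a regional dataset into a global background could look like. This is an illustrative toy (the function name, weighting scheme, and array shapes are our own assumptions, not Current Lab's actual pipeline): a high-resolution regional field is blended into a coarse global field with edge feathering so the combined product is gap-free and seam-free.

```python
# Hypothetical sketch of the "fuse in post-processing" step: blend a
# high-resolution regional field into a coarse global background.
# Names, shapes, and the linear edge ramp are illustrative only.
import numpy as np

def fuse(global_field, regional_field, row0, col0, blend_width=2):
    """Write a regional patch into the global field, feathering the
    patch edges so there is no visible seam at the boundary."""
    out = global_field.copy()
    nr, nc = regional_field.shape
    # Weight ramps from 0 at the patch edge to 1 in the interior.
    ramp_r = np.minimum.reduce([np.arange(nr), np.arange(nr)[::-1],
                                np.full(nr, blend_width)]) / blend_width
    ramp_c = np.minimum.reduce([np.arange(nc), np.arange(nc)[::-1],
                                np.full(nc, blend_width)]) / blend_width
    w = np.outer(ramp_r, ramp_c)
    window = out[row0:row0 + nr, col0:col0 + nc]
    out[row0:row0 + nr, col0:col0 + nc] = (1 - w) * window + w * regional_field
    return out

background = np.zeros((10, 10))   # coarse global solution (toy values)
patch = np.ones((6, 6))           # high-resolution regional solution
fused = fuse(background, patch, 2, 2)
```

In the fused result, the patch interior carries the regional values while the edges transition smoothly back to the global background.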

Conceptual schematic of the Current Lab global ocean mosaic.
By focusing on small coastal zones and key chokepoints, we're able to achieve up to 50x finer resolution than traditional global models.
Each model computes the full surface-to-seafloor physics for up to one million grid points in a single region. This produces accurate predictions of ocean flow interacting with coastlines, rivers, islands, and underwater topography.
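The "up to one million grid points" figure is easy to sanity-check with illustrative numbers (a hypothetical domain of our choosing, not an actual Current Lab configuration): a 60 km x 30 km port region at 200 m horizontal spacing with 22 vertical levels lands just under a million cells.

```python
# Illustrative sizing of one regional model domain. The domain extent
# and level count are hypothetical, chosen only to show the arithmetic.

domain_x_m, domain_y_m = 60_000, 30_000  # 60 km x 30 km port region
dx_m = 200                               # horizontal grid spacing
vertical_levels = 22                     # surface-to-seafloor layers

nx = domain_x_m // dx_m            # cells east-west
ny = domain_y_m // dx_m            # cells north-south
total = nx * ny * vertical_levels  # full 3D grid

print(f"{nx} x {ny} x {vertical_levels} = {total:,} grid points")
```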

The enabling tech stack
Three necessary components have matured to the point where ocean forecasting can finally be solved: observational data, ocean science, and computational power.
1. Observational Datasets
Data from satellites, buoys, autonomous vehicles, and more all contribute to accurate ocean forecasting. Current Lab's data processing layer ingests a wide variety of datasets for use in model configuration, model forcing, and model skill analysis/tuning. As new sources become available, we continue expanding our network of data partners to stay at the bleeding edge of ocean intelligence.
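One way to picture an ingestion layer like this is as a set of per-source adapters that normalize heterogeneous feeds into one common observation record, so downstream steps (model forcing, skill analysis) can treat all sources uniformly. The record fields and the buoy feed's field names below are our own assumptions, sketched for illustration:

```python
# Hypothetical sketch of a data ingestion layer: each source gets an
# adapter that maps its raw field names onto one shared record type.
from dataclasses import dataclass

@dataclass
class Observation:
    source: str        # e.g. "satellite", "buoy", "auv"
    lat: float
    lon: float
    variable: str      # e.g. "sst_c" for sea surface temperature
    value: float
    time_utc: str

def ingest_buoy(raw: dict) -> Observation:
    """Adapter for one (hypothetical) buoy feed's field names."""
    return Observation(source="buoy",
                       lat=raw["latitude"], lon=raw["longitude"],
                       variable=raw["param"], value=raw["reading"],
                       time_utc=raw["timestamp"])

obs = ingest_buoy({"latitude": 37.8, "longitude": -122.5,
                   "param": "sst_c", "reading": 14.2,
                   "timestamp": "2024-06-01T12:00:00Z"})
```

Adding a new data partner then means writing one new adapter rather than touching the downstream forecasting code.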

2. Ocean Physics Models
Our powerful 3D hydrodynamic models generate gap-free, surface-to-seafloor predictions of ocean conditions. Observational datasets are used for some forcing conditions and to test model accuracy, but they are fundamentally too sparse to drive prediction on their own.
By simulating ocean physics from first principles, we are uniquely suited to predicting the effects of novel conditions like extreme storms and climate change. If a new shipping channel is dredged or even a new island is built, we can incorporate it into our models immediately without waiting for any new measurements or ML training.
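The dredging example illustrates why a first-principles model adapts instantly: the new channel is just an edit to the bathymetry grid the model reads, with no retraining and no wait for new measurements. A toy version (array shape and depths are illustrative):

```python
# Illustrative sketch: dredging a shipping channel is an edit to the
# bathymetry input of a physics model. The grid size and depths here
# are toy values, not real port data.
import numpy as np

depth_m = np.full((5, 5), 8.0)   # harbor bathymetry, uniformly 8 m deep
depth_m[2, :] = 15.0             # dredge a 15 m channel across row 2
# The next forecast run simply reads the updated grid.
```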

3. Cloud Computing
The increasing power and affordability of cloud computing makes it possible to scale up to our goal of a global mosaic of high-resolution ocean models. In the recent past, even running a single one of our regional models would have required a specialized computer cluster only accessible to universities or government agencies like NOAA. Today, we can scale up rapidly and run our proprietary forecast models 365 days per year without any on-site supercomputer.
