How Do Data Rows and Compute Credits Work?
Learn what data rows and compute credits are, what consumes them, and how to plan your resource usage in Datature Vi.
Datature Vi uses two resource currencies: data rows for storage and annotation, and compute credits for GPU training time. Your plan sets a quota for each. This page explains what consumes each resource, how to estimate costs before training, and what happens when you hit a limit.
Two resource types
Both resources are tracked at the organization level, shared across all projects and team members. You can view current usage in Settings > Resources. See Resource Usage for the full monitoring guide.
What consumes data rows
Data rows are consumed when you add content to your datasets.
Example: Uploading 100 images with 3 annotation pairs each consumes 100 × 5 = 500 data rows for the images plus 100 × 3 = 300 for the annotation pairs, or 800 data rows in total.
Deleting images or annotation pairs returns the associated data rows to your available balance immediately. Uploads always consume data rows on ingest, so curate your dataset before uploading when you can, and check Settings > Resources after large cleanups to confirm your quota matches what you expect.
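The arithmetic in the example above can be sketched as a small helper. The per-item rates here (5 data rows per image, 1 per annotation pair) are assumptions inferred from that worked example; check your plan's actual rates in Settings > Resources.

```python
# Assumed rates, inferred from the worked example above -- verify
# against your plan before relying on these numbers.
ROWS_PER_IMAGE = 5
ROWS_PER_ANNOTATION_PAIR = 1

def data_rows_consumed(num_images: int, pairs_per_image: int) -> int:
    """Total data rows consumed by uploading a batch of images,
    each with the same number of annotation pairs."""
    image_rows = num_images * ROWS_PER_IMAGE
    annotation_rows = num_images * pairs_per_image * ROWS_PER_ANNOTATION_PAIR
    return image_rows + annotation_rows

print(data_rows_consumed(100, 3))  # 800, matching the example
```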
What consumes compute credits
Compute credits are consumed during training runs. The cost depends on two factors: how long the run takes and which GPU you selected.
Each GPU tier has a multiplier applied to the training duration; the current rates are listed in the GPU multiplier table on the Resource Usage page, alongside your live quotas. For multi-GPU jobs, the multipliers are summed across GPUs (for example, 4× A10G consumes 4 × 2.5 = 10.0× credits per real-time minute).
Calculation: Credits consumed = Training duration (minutes) × GPU multiplier (sum all GPUs when you use more than one).
Example: A 2-hour training run on one A10G GPU consumes 120 minutes × 2.5 = 300 compute credits.
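The calculation above can be expressed as a short function. The multiplier table here contains only the A10G rate given in this page's example; treat any other entries you add as assumptions until confirmed against the Resource Usage page.

```python
# GPU multipliers: the 2.5x A10G rate comes from the example above.
# Confirm current rates in the table on the Resource Usage page.
GPU_MULTIPLIERS = {"A10G": 2.5}

def credits_consumed(duration_minutes: float, gpus: list[str]) -> float:
    """Credits = training duration (minutes) x sum of GPU multipliers."""
    total_multiplier = sum(GPU_MULTIPLIERS[g] for g in gpus)
    return duration_minutes * total_multiplier

print(credits_consumed(120, ["A10G"]))      # 300.0 (the 2-hour example)
print(credits_consumed(60, ["A10G"] * 4))   # 600.0 (4-GPU job: 60 x 10.0)
```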
Compute credits reset monthly. Unused credits from the current month do not carry over.
How to estimate costs before training
Before launching a training run, estimate your compute credit consumption:
Check your dataset size
Larger datasets take longer to train. A dataset of 100 images trains faster than one with 1,000 images.
Note your training settings
More epochs and larger batch sizes increase training time. Check the defaults in Model Settings for your chosen model.
Pick a GPU tier
Start with the recommendation Datature Vi shows for your model size. Smaller GPUs cost fewer credits per minute but may take longer to train.
Estimate the duration
A rough guide: 100 images on a 7B model with LoRA on an A10G GPU takes about 1 hour. Scale up from there based on your dataset size and epoch count.
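The estimation steps above can be combined into a back-of-envelope calculator. The baseline (about 1 hour for 100 images on a 7B model with LoRA on an A10G) comes from the rough guide above; the linear scaling with image count and epoch count is an assumption for rough planning only, since real durations also vary with model and batch size.

```python
A10G_MULTIPLIER = 2.5    # from the GPU multiplier table on Resource Usage
BASELINE_MINUTES = 60    # ~1 h: 100 images, 7B model, LoRA, one A10G
BASELINE_IMAGES = 100

def estimate_credits(num_images: int, epochs: int = 1,
                     gpu_multiplier: float = A10G_MULTIPLIER) -> float:
    """Rough compute-credit estimate, assuming duration scales
    linearly with dataset size and epoch count (an assumption)."""
    minutes = BASELINE_MINUTES * (num_images / BASELINE_IMAGES) * epochs
    return minutes * gpu_multiplier

# 1,000 images for 2 epochs: 1200 estimated minutes x 2.5 = 3000.0 credits
print(estimate_credits(1000, epochs=2))
```

Treat the result as an order-of-magnitude figure for choosing a GPU tier, then compare it against actual consumption after your first run.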
What happens when you hit a limit
Hitting a limit never deletes your data, models, or training results. You retain full access to everything you've already created.
