Models & Libraries
Hit the ground running with pre-trained models.
EnergeticAI maximizes cold-start performance while minimizing module size.
- Cold-Start Speed
- Warm-Start Speed
- Module Size
Compared to TensorFlow.js
In serverless functions, inference speed is dominated by cold-start time.
This benchmark initializes the model and computes an embedding for a 5-sentence paragraph on an M1 Max MacBook Pro.
Install in seconds, and scale with business-friendly licensing.
Download EnergeticAI from NPM:
npm install @energetic-ai/core
Requires Node.js 18+. EnergeticAI is Apache 2.0 licensed, though individual dependencies may use different licenses.
Get Started →
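Once installed, computing embeddings might look like the sketch below. The package and function names used here (`@energetic-ai/embeddings`, `@energetic-ai/model-embeddings-en`, `initModel`, `embed`, `distance`) are based on the project's quick-start and should be checked against the current docs before use:

```javascript
// Sketch, assuming the @energetic-ai/embeddings quick-start API.
import { initModel, distance } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

// Load the pre-trained English embeddings model. The weights ship
// with the package, which is what keeps cold starts fast.
const model = await initModel(modelSource);

// Embed two phrases and compare their similarity.
const [hello, hi] = await model.embed(["hello", "hi there"]);
console.log(distance(hello, hi));
```

Because the model loads from the installed module rather than over the network, this pattern is well suited to serverless handlers where every cold start counts.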