
Use open-source AI in your Node.js apps, up to 67x faster.

EnergeticAI is TensorFlow.js, optimized for serverless environments, with fast cold-start, small module size, and pre-trained models.

Install in seconds, and scale with business-friendly licensing.

import { initModel, distance } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

(async () => {
  // Load the pre-trained English embeddings model.
  const model = await initModel(modelSource);
  // Embed two sentences and compare how close they are in meaning.
  const [hello, world] = await model.embed([
    "hello",
    "world",
  ]);
  console.log(distance(hello, world));
})();
Pre-trained embeddings for recommendations and more.

Models & Libraries

Hit the ground running with pre-trained models.

Embeddings (English)

Build recommendations and more with sentence embeddings.
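
For example, a simple recommender can rank candidate items by how close their embeddings are to a query embedding. The sketch below uses only the initModel, embed, and distance calls from the snippet above; the candidate texts and ranking logic are illustrative, and it assumes distance returns smaller values for more similar sentences.

import { initModel, distance } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

(async () => {
  const model = await initModel(modelSource);
  const articles = [
    "How to brew better coffee at home",
    "A beginner's guide to trail running",
    "Choosing the right espresso grinder",
  ];
  // Embed the query and the candidates in one call, then rank the candidates
  // by embedding distance to the query (smaller = more similar, assumed).
  const [query, ...candidates] = await model.embed([
    "coffee brewing tips",
    ...articles,
  ]);
  const ranked = articles
    .map((text, i) => ({ text, score: distance(query, candidates[i]) }))
    .sort((a, b) => a.score - b.score);
  console.log(ranked);
})();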

Classifiers (English)

Classify text into categories with just a few training examples.
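
As a rough illustration of the idea (not the classifiers package's actual API), few-shot classification can be hand-rolled on top of the embeddings package by taking the label of the nearest labeled example; the example data and helper logic below are illustrative.

import { initModel, distance } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

(async () => {
  const model = await initModel(modelSource);
  // A few labeled training examples per category (illustrative data).
  const examples = [
    { text: "I love this product", label: "positive" },
    { text: "This works great", label: "positive" },
    { text: "This is terrible", label: "negative" },
    { text: "I want a refund", label: "negative" },
  ];
  const input = "Really happy with my purchase";
  const [inputVec, ...exampleVecs] = await model.embed([
    input,
    ...examples.map((e) => e.text),
  ]);
  // 1-nearest-neighbor: take the label of the closest training example.
  let best = { label: null, score: Infinity };
  examples.forEach((e, i) => {
    const score = distance(inputVec, exampleVecs[i]);
    if (score < best.score) best = { label: e.label, score };
  });
  console.log(best.label);
})();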

Semantic Search (English, Planned)

Provide answers based on meaning with question-answering models.

Performance

EnergeticAI maximizes cold-start performance while minimizing module size.

[Benchmark chart comparing cold-start speed, warm-start speed, and module size for EnergeticAI with Bundled Model, EnergeticAI, Tensorflow.js with Node.js Backend, and Tensorflow.js. Cold-start times range from 55 ms (EnergeticAI with Bundled Model) to 3,711 ms (Tensorflow.js).]

Up to 67x faster compared to Tensorflow.js.

Inference speed in serverless functions is dominated by cold-start speed.

This benchmark initializes the model and computes an embedding for a 5-sentence paragraph on an M1 Max MacBook Pro.
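
Re-creating that measurement end to end might look roughly like the sketch below; the timing harness and paragraph text are illustrative, not the published benchmark code.

import { initModel } from "@energetic-ai/embeddings";
import { modelSource } from "@energetic-ai/model-embeddings-en";

(async () => {
  const start = performance.now();
  // Time model initialization plus a single embed call for a short paragraph.
  const model = await initModel(modelSource);
  await model.embed([
    "EnergeticAI is TensorFlow.js, optimized for serverless environments. " +
      "It has fast cold-start, a small module size, and pre-trained models. " +
      "This placeholder paragraph stands in for the benchmark's five sentences. " +
      "The timer covers initialization and one embedding computation. " +
      "Actual numbers depend on the runtime and hardware.",
  ]);
  console.log(`cold start + embed: ${Math.round(performance.now() - start)} ms`);
})();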

View Benchmark

Usability

Install in seconds, and scale with business-friendly licensing.

Download EnergeticAI from NPM:

npm install @energetic-ai/core
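
The embeddings example above also imports the embeddings package and the English model, which install the same way:

npm install @energetic-ai/embeddings @energetic-ai/model-embeddings-en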

Requires Node 18+. EnergeticAI is Apache 2.0 licensed, though its dependencies may use different licenses.

Get Started →