ZETIC.MLange

Future Works

Upcoming features and improvements for ZETIC.MLange

ZETIC.MLange is continuously evolving. We are actively researching the technologies below and will roll them out in upcoming releases.

1. LLM NPU Support

We are optimizing Large Language Models to fully utilize NPUs through advanced graph optimization and automated model splitting.

Graph Optimization

Advanced graph optimization techniques are applied to the LLM to prepare it for efficient NPU execution.
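One common class of graph optimization is operator fusion, where adjacent operations are merged into a single kernel to cut memory traffic between NPU compute steps. The toy pass below is only an illustrative sketch of this idea; the node and operator names are hypothetical and do not reflect MLange's internal passes.

```kotlin
// Illustrative sketch only: MLange's real optimization passes are internal.
// This toy pass fuses each adjacent MatMul + Add pair into one
// FusedMatMulAdd node, a typical rewrite for NPU-friendly graphs.

data class Node(val op: String, val name: String)

fun fuseMatMulAdd(graph: List<Node>): List<Node> {
    val out = mutableListOf<Node>()
    var i = 0
    while (i < graph.size) {
        val cur = graph[i]
        val next = graph.getOrNull(i + 1)
        if (cur.op == "MatMul" && next?.op == "Add") {
            // Replace the pair with a single fused node.
            out.add(Node("FusedMatMulAdd", "${cur.name}+${next.name}"))
            i += 2
        } else {
            out.add(cur)
            i += 1
        }
    }
    return out
}

fun main() {
    val graph = listOf(Node("MatMul", "mm0"), Node("Add", "bias0"), Node("Relu", "act0"))
    println(fuseMatMulAdd(graph).map { it.op })  // [FusedMatMulAdd, Relu]
}
```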

Model Splitting

The large model is intelligently split into chunks that fit within the NPU's memory constraints.
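The core constraint behind splitting can be sketched as a packing problem: walk the model layer by layer and start a new chunk whenever the next layer would exceed the NPU's per-chunk memory budget. This is a minimal greedy sketch of that idea, not the actual MLange splitter, which also accounts for graph structure.

```kotlin
// Illustrative sketch only: greedily packs consecutive layers into chunks
// whose combined weight size fits a per-chunk NPU memory budget.
// Returns the layer indices assigned to each chunk.

fun splitByMemory(layerBytes: List<Long>, budgetBytes: Long): List<List<Int>> {
    require(layerBytes.all { it <= budgetBytes }) { "a single layer exceeds the budget" }
    val chunks = mutableListOf<MutableList<Int>>()
    var used = 0L
    for ((i, size) in layerBytes.withIndex()) {
        if (chunks.isEmpty() || used + size > budgetBytes) {
            chunks.add(mutableListOf())  // start a new chunk
            used = 0L
        }
        chunks.last().add(i)
        used += size
    }
    return chunks
}

fun main() {
    // Four 3 GB layers with an 8 GB budget split into two chunks of two layers.
    val gb = 1L shl 30
    println(splitByMemory(listOf(3 * gb, 3 * gb, 3 * gb, 3 * gb), 8 * gb))  // [[0, 1], [2, 3]]
}
```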

NPU Compilation

Each model chunk is compiled specifically for the target NPU architecture.

Hybrid Runtime Execution

The ZETIC.MLange runtime orchestrates the execution, leveraging the NPU for heavy computations and the CPU for fallback operations.
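The placement decision behind hybrid execution can be sketched as a simple routing rule: operations the NPU compiler supports run on the NPU, and everything else falls back to the CPU. The names below are illustrative; the actual runtime makes this choice internally.

```kotlin
// Illustrative sketch only: route each op to the NPU when it is in the
// compiled/supported set, otherwise fall back to the CPU.

enum class Backend { NPU, CPU }

fun placeOps(ops: List<String>, npuSupported: Set<String>): Map<String, Backend> =
    ops.associateWith { op -> if (op in npuSupported) Backend.NPU else Backend.CPU }

fun main() {
    val plan = placeOps(
        ops = listOf("MatMul", "Softmax", "CustomSample"),
        npuSupported = setOf("MatMul", "Softmax")
    )
    println(plan)  // {MatMul=NPU, Softmax=NPU, CustomSample=CPU}
}
```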

2. NPU Hardware Support Update

We are actively expanding our high-performance NPU support to a wider range of hardware platforms.

Target Platforms

  • Mobile: Samsung Exynos, Google Edge TPU
  • Microcontrollers: NXP, Infineon

3. On-device RAG (Sneak Peek)

Enhance model accuracy and relevance with local knowledge retrieval.

Local RAG API Preview
// Initialize RAG Pipeline
val rag = ZeticRAGPipeline(
    retriever = LocalVectorDB(path = "docs.db"),
    generator = ZeticMLangeModel("llama-3.2-1b")
)

// Retrieve and Generate
val response = rag.query("How do I use ZETIC.MLange?")

4. On-device MCP (Sneak Peek)

Standardized context management for robust AI agents.

Agent State Management Preview
// Define tool resource
val weatherTool = Tool("get_weather") { location -> 
    // ... implementation ...
}

// Pass tools to context
val agent = ZeticAgent(
    model = model,
    tools = listOf(weatherTool)
)

agent.chat("What's the weather in Seoul?")

5. On-device Multimodal NPU Acceleration Support

We are enabling support for multimodal models that process text, images, and audio simultaneously through a unified encoder architecture. This allows for rich, interactive user experiences where the device can "see" and "hear" its environment.
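The unified-encoder idea can be sketched as follows: each modality is encoded into the same token space, and the resulting sequences are concatenated into one input for a single decoder pass. All names, token counts, and rates below are hypothetical placeholders, not the MLange multimodal API.

```kotlin
// Toy sketch of a unified multimodal input: every modality produces tokens
// in a shared space, and the sequences are concatenated for one model pass.
// Patch counts and sample rates here are illustrative assumptions.

data class Token(val modality: String, val id: Int)

fun encodeText(text: String) = text.split(" ").mapIndexed { i, _ -> Token("text", i) }
fun encodeImage(pixels: Int) = List(4) { Token("image", it) }                 // fixed patch count
fun encodeAudio(samples: Int) = List(samples / 16000) { Token("audio", it) }  // ~1 token per second

fun unifiedInput(text: String, pixels: Int, samples: Int): List<Token> =
    encodeImage(pixels) + encodeAudio(samples) + encodeText(text)

fun main() {
    val seq = unifiedInput("describe this scene", pixels = 224 * 224, samples = 32000)
    println(seq.groupingBy { it.modality }.eachCount())  // {image=4, audio=2, text=3}
}
```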

These groundbreaking features are coming very soon. Stay tuned!


Contact Us

Collaborations and update requests are always welcome! Please contact us at contact@zetic.ai to discuss your needs or share feedback.