Use this page if you are building for the AI track and want a practical model for how AI should fit into your Initia appchain architecture. You do not need to run model inference onchain. In most projects, inference will happen offchain, with the appchain handling the parts of the product that need blockchain guarantees.

Use AI Offchain

Generate or transform content, make recommendations, power copilots or agents, and run classification or summarization.

Use the Appchain

Store ownership and state, handle rewards and payments, enforce access or reputation, and coordinate marketplaces or escrow.
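The split above can be sketched in a few lines. This is an illustration only: `generate_article` stands in for a call to your hosted model, and `build_mint_tx` with its `mint_content` message is a hypothetical placeholder for whatever transaction format your Initia appchain actually uses. The key idea is that only a compact fingerprint of the AI output needs to go onchain, not the output itself.

```python
import hashlib


def generate_article(prompt: str) -> str:
    """Offchain step: in a real app, call your hosted model
    (OpenAI, Anthropic, etc.) here. Stubbed for illustration."""
    return f"AI-generated draft for: {prompt}"


def content_fingerprint(content: str) -> str:
    """Hash the AI output so the appchain can record ownership
    of it without storing the full text onchain."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()


def build_mint_tx(owner: str, content: str) -> dict:
    """Onchain step: record ownership/state. `mint_content` is a
    hypothetical message type; replace with your chain's actual tx."""
    return {
        "msg": "mint_content",
        "owner": owner,
        "content_hash": content_fingerprint(content),
    }


draft = generate_article("a product description for my marketplace listing")
tx = build_mint_tx("init1exampleowner", draft)
print(tx["content_hash"])
```

The same pattern works for rewards, reputation, or escrow: the model produces the content offchain, and the appchain records the facts your product needs guarantees about.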

External AI Services

You may use hosted model APIs in your project. If your project depends on a third-party AI provider, plan to supply your own API key or backend configuration for development and demos. For most hackathon teams, hosted APIs are the fastest and most common way to add AI features.
API key security: If your project uses an external AI API, store secrets in a local .env file or another secure secret manager. Do not commit API keys to GitHub or expose them in your demo video, screenshots, or frontend code.
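One simple way to enforce this in a Python backend is to read every secret from the environment and fail loudly when one is missing. This sketch assumes you load your local .env with your framework's loader (for example python-dotenv) or export the variables in your shell before running; the helper name is ours, not a library API.

```python
import os


def require_secret(name: str) -> str:
    """Fetch a secret from the environment, failing loudly if it is
    missing, so a forgotten key surfaces at startup rather than in a
    confusing API error mid-demo."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set. Put it in a local .env file "
            "(listed in .gitignore) and load it before running."
        )
    return value


# Usage: api_key = require_secret("OPENAI_API_KEY")
```

Because the key never appears in source, it cannot leak through a GitHub commit, a screenshot, or frontend bundles, which only ever talk to your backend.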
If you need to add a model provider to your app, start with the official API docs for your chosen service:

- OpenAI: API docs
- Anthropic: API docs
- Google Gemini: API docs

For demos, it is acceptable to use mocked, cached, or pre-generated AI outputs if you clearly disclose that setup. Judges will evaluate the product, appchain integration, and overall user experience, not whether the model itself is hosted onchain.
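A mocked setup can be as small as a fallback function. In this sketch, `CACHED_RESPONSES` and `ai_respond` are hypothetical names for your own demo-mode layer: when an API key is configured and a live call is supplied, the real model runs; otherwise pre-generated output keeps the demo working end to end. Remember to disclose the mock in your submission.

```python
import os

# Pre-generated outputs for demo mode; disclose this setup to judges.
CACHED_RESPONSES = {
    "summarize": "This listing is a rare collectible minted on our appchain.",
}


def ai_respond(task: str, live_call=None) -> str:
    """Use the live model when an API key is configured and a caller
    is provided; otherwise fall back to cached demo output."""
    if os.environ.get("OPENAI_API_KEY") and live_call is not None:
        return live_call(task)
    return CACHED_RESPONSES.get(task, "(demo output unavailable)")
```

This also keeps your appchain integration testable offline: the onchain flow exercises real transactions even when the model output is canned.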