Endpoints
Setting Up and Communicating with Endpoints on the Nosana Network
When you create a deployment on the Nosana network, you can expose services through endpoints: each exposed port on an instance becomes a URL you can communicate with.
This guide will walk you through setting up an Nginx server and interacting with its endpoint. Afterwards, we will set up a Llama instance and start communicating with it.
Proof of concept: Nginx
Nginx is a high-performance web server and reverse proxy server that is widely used for serving static content, load balancing, and handling HTTP and HTTPS traffic.
It makes a good proof of concept for showing how to use a Nosana endpoint.
Setting Up an Nginx Server
Step 1: Create a Deployment
First, create a deployment with a job definition that exposes port 80 for Nginx.
```typescript
const deployment = await client.deployments.create({
  name: 'Nginx Server',
  market: '97G9NnvBDQ2WpKu6fasoMsAKmfj63C9rhysJnkeWodAf', // NVIDIA 4090
  timeout: 60, // 60 minutes
  replicas: 1,
  strategy: 'SIMPLE',
  job_definition: {
    "version": "0.1",
    "type": "container",
    "meta": {
      "trigger": "api"
    },
    "ops": [
      {
        "type": "container/run",
        "id": "nginx",
        "args": {
          "cmd": [],
          "image": "nginx",
          "expose": 80
        }
      }
    ]
  }
});
```

```shell
curl -X POST "https://dashboard.k8s.prd.nos.ci/api/deployments/create" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $NOSANA_API_KEY" \
  -d '{
    "name": "Nginx Server",
    "market": "97G9NnvBDQ2WpKu6fasoMsAKmfj63C9rhysJnkeWodAf",
    "timeout": 60,
    "replicas": 1,
    "strategy": "SIMPLE",
    "job_definition": {
      "version": "0.1",
      "type": "container",
      "meta": {
        "trigger": "api"
      },
      "ops": [
        {
          "type": "container/run",
          "id": "nginx",
          "args": {
            "cmd": [],
            "image": "nginx",
            "expose": 80
          }
        }
      ]
    }
  }'
```

Step 2: Start the Deployment
After creating the deployment, start it so that it actually executes.
```typescript
// Get the deployment and start it
const deployment = await client.deployments.get(deploymentId);
await deployment.start();
```

```shell
curl -X POST \
  -H "Authorization: Bearer $NOSANA_API_KEY" \
  https://dashboard.k8s.prd.nos.ci/api/deployments/<deployment_id>/start
```

Step 3: Retrieve the Deployment
Once the deployment is running, retrieve it to get the endpoints. The deployment object includes an endpoints array with the service URLs.
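Right after starting, the endpoints array may still be empty while the node spins up the container, so in practice you may want to poll until endpoints appear. The helper below is a sketch of that idea; the `DeploymentsClient` interface is an assumption modeled on the calls used in this guide, not the SDK's actual types.

```typescript
// Hypothetical types mirroring the deployment shape described in this guide.
interface DeploymentEndpoint {
  opId: string; // operation ID from the job definition
  port: number; // exposed port number
  url: string;  // full service URL
}

interface DeploymentsClient {
  deployments: {
    get(id: string): Promise<{ endpoints?: DeploymentEndpoint[] }>;
  };
}

// Re-fetch the deployment until its endpoints array is populated,
// waiting delayMs between attempts.
async function waitForEndpoints(
  client: DeploymentsClient,
  deploymentId: string,
  attempts = 30,
  delayMs = 10_000,
): Promise<DeploymentEndpoint[]> {
  for (let i = 0; i < attempts; i++) {
    const deployment = await client.deployments.get(deploymentId);
    if (deployment.endpoints?.length) return deployment.endpoints;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`no endpoints for deployment ${deploymentId} after ${attempts} attempts`);
}

// Usage against a real client would look like:
// const endpoints = await waitForEndpoints(client, deploymentId);
```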
```typescript
// Retrieve the deployment to get endpoints
const deployment = await client.deployments.get(deploymentId);
```

```shell
curl -s \
  -H "Authorization: Bearer $NOSANA_API_KEY" \
  https://dashboard.k8s.prd.nos.ci/api/deployments/<deployment_id> | jq .
```

Step 4: Access the Service Endpoint
The deployment object contains the service URLs in the endpoints field. You can now access your Nginx service at this URL.
```typescript
// Access the service endpoint from the deployment
const serviceUrl = deployment.endpoints[0]?.url;
```

```shell
# First, retrieve the deployment to get the endpoint URL
DEPLOYMENT_RESPONSE=$(curl -s \
  -H "Authorization: Bearer $NOSANA_API_KEY" \
  https://dashboard.k8s.prd.nos.ci/api/deployments/<deployment_id>)

# Extract the service URL from the endpoints array
SERVICE_URL=$(echo "$DEPLOYMENT_RESPONSE" | jq -r '.endpoints[0].url')
```

Info
The service endpoint URL will be in the format: https://<id>.node.k8s.prd.nos.ci
You can find this URL in the endpoints array of the deployment object. Each endpoint includes:
- opId: The operation ID from your job definition
- port: The exposed port number
- url: The full service URL
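Given that shape, looking an endpoint up by opId is more robust than indexing `endpoints[0]` once a job definition contains several ops. The helper below is hypothetical, sketched against the fields listed above rather than the SDK's actual types:

```typescript
// Hypothetical helper: look up a service URL by operation ID.
// The Endpoint shape mirrors the fields described above; it is an
// assumption, not an SDK type.
interface Endpoint {
  opId: string; // operation ID from the job definition
  port: number; // exposed port number
  url: string;  // full service URL
}

function endpointUrlFor(endpoints: Endpoint[], opId: string): string | undefined {
  return endpoints.find((endpoint) => endpoint.opId === opId)?.url;
}

// Example against a mock endpoints array:
const endpoints: Endpoint[] = [
  { opId: 'nginx', port: 80, url: 'https://abc123.node.k8s.prd.nos.ci' },
];
const nginxUrl = endpointUrlFor(endpoints, 'nginx');
```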
Navigate to the service URL to find your Nginx service.
Success! Your Nginx instance is running on the Nosana Network.
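If you prefer to verify from code rather than the browser, you can request the endpoint and check the status code. `checkEndpoint` is a hypothetical helper, not part of the SDK; it accepts any fetch-compatible function so the sketch can be exercised without a live deployment.

```typescript
// Minimal shape of the fetch function we rely on.
type FetchLike = (url: string) => Promise<{ status: number }>;

// Request the service URL and report whether it answered with HTTP 200.
async function checkEndpoint(url: string, fetchFn: FetchLike = fetch): Promise<boolean> {
  const res = await fetchFn(url);
  return res.status === 200;
}

// Against a real deployment you would call:
// await checkEndpoint('https://<id>.node.k8s.prd.nos.ci');
```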