In the previous part, we built an Acorn Service that exposes an instance of RabbitMQ. In this final part, we will build two Acorns – a publisher that periodically publishes messages to a queue and a subscriber that reads from the same queue.
Ensure the Acorn Service is running before proceeding to the next steps.
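If in doubt, listing the running Acorn apps should show the service from the previous part (it is referenced as rabbitmq-01 in the Acornfiles below):
acorn ps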
Step 1 – Create an Acorn Job acting as a Publisher
Create a directory called pub to store all the relevant files for the publisher. Under src, create the Python script app.py below, which publishes to a queue called hello:
import pika, os

# Access the CLOUDAMQP_URL environment variable and parse it (fallback to localhost)
url = os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost:5672/%2f')
params = pika.URLParameters(url)

connection = pika.BlockingConnection(params)
channel = connection.channel()  # start a channel
channel.queue_declare(queue='hello')  # declare a queue
channel.basic_publish(exchange='',
                      routing_key='hello',
                      body='Hello CloudAMQP!')
print(" [x] Sent 'Hello CloudAMQP!'")
connection.close()
Notice that the script receives the RabbitMQ endpoint and credentials through the CLOUDAMQP_URL environment variable.
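The value follows the standard AMQP URI scheme; as a rough illustration, it will look something like the line below (the username, password, host, and vhost here are placeholders, not the actual values the service generates):
amqp://username:password@hostname:5672/vhost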
Now, create the Dockerfile for the Python script.
FROM python:alpine3.9
RUN pip install pika==1.1.0
COPY ./src /src
ENTRYPOINT ["python","-u","/src/app.py"]
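If you want to smoke-test the image before wiring it into Acorn, you can build and run it locally; the image tag and the CLOUDAMQP_URL value below are just placeholders pointing at a local RabbitMQ:
docker build -t rabbitmq-pub .
docker run --rm -e CLOUDAMQP_URL='amqp://guest:guest@localhost:5672/%2f' rabbitmq-pub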
It’s time to define the Acorn that runs this container as a Job. It will be scheduled to publish a message every minute.
services: "rabbitmq-cloudamqp-server": {
    external: "rabbitmq-01"
}

jobs: {
    "rabbitmq-pub": {
        build: context: "."
        env: {
            CLOUDAMQP_URL: "@{service.rabbitmq-cloudamqp-server.data.url}"
        }
        schedule: "* * * * *"
    }
}
We reference the Acorn Service that is already running in the services section of the Acornfile. We then define the job and pass the CloudAMQP URL exposed by the Service as an environment variable to the Python script.
Run the Acornfile to kick off the job.
acorn run -n publisher .
You can see from the logs that the job is publishing messages:
acorn logs -f publisher
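Since the job is scheduled every minute, you should see a line matching the script’s print statement appear roughly once a minute, along these lines:
 [x] Sent 'Hello CloudAMQP!'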
While this is running, let’s create the subscriber.
Step 2 – Create an Acorn acting as a Subscriber
Create a directory called sub to store all the relevant files for the subscriber. Under src, create the Python script app.py below, which consumes from the same hello queue:
import pika, os

# Access the CLOUDAMQP_URL environment variable and parse it (fallback to localhost)
url = os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost:5672/%2f')
params = pika.URLParameters(url)

connection = pika.BlockingConnection(params)
channel = connection.channel()  # start a channel
channel.queue_declare(queue='hello')  # declare a queue

# Called once for every message delivered from the queue
def callback(ch, method, properties, body):
    print(" [x] Received " + str(body))

channel.basic_consume('hello',
                      callback,
                      auto_ack=True)

print(' [*] Waiting for messages:')
channel.start_consuming()
Let’s create the Dockerfile and Acornfile to run the subscriber.
FROM python:alpine3.9
RUN pip install pika==1.1.0
COPY ./src /src
ENTRYPOINT ["python","-u","/src/app.py"]
services: "rabbitmq-cloudamqp-server": {
    external: "rabbitmq-01"
}

containers: {
    "rabbitmq-sub": {
        build: context: "."
        env: {
            CLOUDAMQP_URL: "@{service.rabbitmq-cloudamqp-server.data.url}"
        }
    }
}
Note that, unlike the publisher, the subscriber is defined as a long-running container rather than a scheduled job, since it must keep consuming messages. Let’s run it and watch the logs to confirm that it receives the messages.
acorn run -n subscriber .
acorn logs -f subscriber
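If everything is wired up correctly, the subscriber’s output should match the script’s print statements; note that pika delivers the message body as bytes, hence the b'...' form:
 [*] Waiting for messages:
 [x] Received b'Hello CloudAMQP!'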
We can see the messages flowing from the publisher to the subscriber via the RabbitMQ instance running in the cloud.
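When you are done experimenting, you can clean up both Acorns (the Acorn Service from the previous part can be removed the same way):
acorn rm publisher
acorn rm subscriber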
This tutorial walked you through all the steps involved in publishing Acorn Services and consuming them from multiple Acorns. If you’d like to learn more about getting started with Acorn, view our getting started workshop.
Janakiram is a practicing architect, analyst, and advisor focusing on emerging infrastructure technologies. He provides strategic advisory to hyperscalers, technology platform companies, startups, ISVs, and enterprises. As a practitioner working with a diverse Enterprise customer base across cloud native, machine learning, IoT, and edge domains, Janakiram gains insight into the enterprise challenges, pitfalls, and opportunities involved in emerging technology adoption. Janakiram is an Amazon, Microsoft, and Google certified cloud architect, as well as a CNCF Ambassador and Microsoft Regional Director. He is an active contributor at Gigaom Research, Forbes, The New Stack, and InfoWorld. You can follow him on Twitter.
Header Photo by Lukasz Szmigiel on Unsplash