Clean Up Event Data in Ansible Event-Driven Automation

Clean and normalize event data in Ansible Event-Driven Automation (EDA) with the ansible.eda.dashes_to_underscores filter for smoother, more reliable automation.

By Binoj Melath Nalinakshan Nair, DZone Core · Apr. 25, 2025 · Tutorial


In the past few articles, we explored how to use different event sources in Ansible Event-Driven Automation (EDA). In this demo, we'll focus on how event filters can help clean up and simplify event data, making automation easier to manage. Specifically, we'll explore the ansible.eda.dashes_to_underscores event filter and how it works.

When using Ansible EDA with tools like webhooks, Prometheus, or cloud services, events often come in as JSON data. These JSON payloads usually have keys with dashes in their names, like alert-name or instance-id. While this is fine in JSON, it becomes a problem in Ansible because variable names with dashes can't be used directly in playbooks or Jinja2 templates. The dashes_to_underscores filter helps solve this issue by converting those dashed keys into names that Ansible can work with more easily.
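
To see why this matters, here is a minimal, hypothetical task (not part of the original demo) that reads a dashed key without the filter. In a Jinja2 expression the dash is parsed as a subtraction operator, so the key can only be reached with bracket notation:

YAML
 
# Hypothetical task inside a playbook launched by an EDA rulebook; for illustration only.
- name: Print a dashed payload key without the filter
  debug:
    # Dot notation ({{ ansible_eda.event.payload.alert-name }}) would fail, because
    # Jinja2 reads the dash as 'alert' minus 'name'. Bracket notation is the only way
    # to reach the key while it still contains a dash:
    msg: "{{ ansible_eda.event.payload['alert-name'] | default('n/a') }}"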

The dashes_to_underscores filter in Ansible EDA automatically replaces dashes (-) in all keys of an event payload with underscores (_). This transformation ensures that variable names conform to Ansible's requirements, making them easier to reference directly in conditions and playbooks.
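
Concretely, the payload used later in this demo would be rewritten as follows (shown as YAML for readability; this is only an illustration of the key renaming, not output from the filter itself):

YAML
 
# Payload keys as received from the webhook (dashes):
payload:
  alert-name: HighCPUUsage
  instance-id: i-123456789

# The same payload after ansible.eda.dashes_to_underscores (underscores):
payload:
  alert_name: HighCPUUsage
  instance_id: i-123456789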

Testing the dashes_to_underscores Filter with a Webhook

To demonstrate how the ansible.eda.dashes_to_underscores filter operates, we'll send a sample JSON payload to a webhook running on port 9000. This payload includes keys with dashes, such as alert-name and instance-id, which are common in JSON data but can pose challenges in Ansible due to variable naming conventions.

webhook.yml

YAML
 
- name: Event Filter dashes_to_underscores demo
  hosts: localhost
  sources:
    - ansible.eda.webhook:
        port: 9000
        host: 0.0.0.0
      filters:
        - ansible.eda.dashes_to_underscores:
  rules:
    - name: Run the playbook if alert_name matches HighCPUUsage
      condition: event.payload.alert_name == "HighCPUUsage"
      action:
        run_playbook:
          name: print-event-vars.yml

print-event-vars.yml

YAML
 
---
- name: Print the Event Details
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Print event details
      debug:
        msg: >
          Detected '{{ ansible_eda.event.payload.alert_name }}' on '{{ ansible_eda.event.payload.instance_id }}', current system timestamp '{{ now(fmt='%Y-%m-%d %H:%M:%S') }}'

Triggering the Webhook with a Sample JSON Payload

Shell
 
curl --header "Content-Type: application/json" \
     --request POST \
     --data '{ "alert-name": "HighCPUUsage", "instance-id": "i-123456789" }' \
     http://localhost:9000/

Inspecting Events with the --print-events Flag

Upon receiving this payload, the dashes_to_underscores filter automatically converts the keys: alert-name becomes alert_name, and instance-id becomes instance_id.

This transformation ensures compatibility with Ansible's variable naming requirements, allowing for straightforward access to these variables in your playbooks and rulebooks.
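
The output below was captured by starting the rulebook with the --print-events flag. A hedged example of the command, assuming ansible-rulebook is installed and a minimal localhost inventory file named inventory.yml exists:

Shell
 
# Start the rulebook and print every received event to stdout.
# inventory.yml can be as small as:  all: {hosts: {localhost: {ansible_connection: local}}}
ansible-rulebook --rulebook webhook.yml \
                 --inventory inventory.yml \
                 --print-events \
                 --verbose

With the rulebook running and the curl request from the previous section sent, the event is printed as shown below.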

JSON
 
Ruleset: Event Filter dashes_to_underscores demo
Event:
{'meta': {'endpoint': '',
          'headers': {'Accept': '*/*',
                      'Content_Length': '62',
                      'Content_Type': 'application/json',
                      'Host': 'localhost:9000',
                      'User_Agent': 'curl/8.7.1'},
          'received_at': '2025-04-23T16:20:40.248651Z',
          'source': {'name': 'ansible.eda.webhook',
                     'type': 'ansible.eda.webhook'},
          'uuid': '8dcf64a6-2ab2-429b-a81a-6dd0e3a48308'},
 'payload': {'alert_name': 'HighCPUUsage', 'instance_id': 'i-123456789'}}


Screenshot of Ansible Print Event Details

Conclusion

The ansible.eda.dashes_to_underscores filter helps clean up event data in Ansible EDA. It automatically changes keys with dashes (like alert-name) into keys with underscores (like alert_name), making them easier to use in playbooks and rulebooks. This matters because Ansible variable names may only contain letters, numbers, and underscores. Using this filter makes your automation clearer and reduces the chance of errors, and it can be combined with other filters, such as json_filter, to further control and shape incoming event data, as sketched below. Overall, this filter helps you work more smoothly with different types of event data in your automation.
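
As a closing illustration, here is a hedged sketch of chaining a second filter in the same webhook source. The exclude_keys parameter is assumed from the ansible.eda.json_filter documentation, so verify it against the collection version you have installed:

YAML
 
  sources:
    - ansible.eda.webhook:
        port: 9000
        host: 0.0.0.0
      filters:
        # Drop the noisy HTTP headers from the event first ...
        - ansible.eda.json_filter:
            exclude_keys: ['headers']
        # ... then normalize the remaining keys for use in conditions and playbooks.
        - ansible.eda.dashes_to_underscores: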

Note: The views expressed in this article are my own and do not necessarily reflect the views of my employer.


Opinions expressed by DZone contributors are their own.

Related

  • Streamlining Event Data in Event-Driven Ansible
  • Setting Up Your First Event-Driven Automation With Ansible
  • AWS CloudTrail Monitoring Using Event-Driven Ansible
  • How to Integrate Event-Driven Ansible With Kafka
