
Generating AVRO Schemas for Data and Making Sure Names Are Correct

Learn how to use Apache NiFi to generate AVRO schemas while ensuring that the field names meet strict naming conventions.


Building schemas by hand is tedious and error-prone. The InferAvroSchema processor can get you started: it generates a compliant schema from sample data. There is one caveat: you have to make sure your field names are Apache Avro-safe. I have a custom processor that will clean your attribute names if you need them to be Avro-safe; see the processor listed below.
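The custom processor itself isn't shown here, but the cleanup it needs to perform follows from the Avro spec: a name must match `[A-Za-z_][A-Za-z0-9_]*`. A minimal Python sketch of that rule (the function name `avro_safe` is my own, not part of NiFi or the author's processor):

```python
import re

def avro_safe(name: str) -> str:
    """Rewrite an arbitrary field name so it satisfies Avro's naming
    rule ([A-Za-z_][A-Za-z0-9_]*): illegal characters become
    underscores, and a leading digit gets an underscore prefix."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name)
    if re.match(r"^\d", cleaned):
        cleaned = "_" + cleaned
    return cleaned

print(avro_safe("order-id"))    # order_id
print(avro_safe("2nd column"))  # _2nd_column
```

Any processor or script that applies this transformation to field names before schema inference will avoid the most common InferAvroSchema failure.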

Example flow utilizing InferAvroSchema:

InferAvroSchema details:

The steps are as follows:

  1. Use Apache NiFi to convert data to JSON or CSV.

  2. Send the JSON or CSV data to InferAvroSchema. I recommend setting Schema Output Destination to flowfile-attribute, Input Content Type to json, and Pretty Avro Output to true.

  3. The inferred schema is now available in the inferred.avro.schema flowfile attribute:

{ "type" : "record", "name" : "schema1", 
  "fields" : [ { 
    "name" : "table", "type" : "string", 
    "doc" : "Type inferred from '\"schema1.tableName\"'" } ] 
}
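To make the inference step concrete, here is a simplified stdlib-only sketch of what InferAvroSchema does for a flat JSON record. This is not NiFi's actual implementation (which also handles nulls, unions, and nested records); the function name `infer_schema` and the type mapping are illustrative assumptions:

```python
import json

# Map Python JSON types to Avro primitives (simplified assumption; the
# real InferAvroSchema processor handles far more cases).
TYPE_MAP = {str: "string", bool: "boolean", int: "long", float: "double"}

def infer_schema(sample_json: str, record_name: str = "schema1") -> dict:
    """Build a flat Avro record schema from one JSON sample object."""
    record = json.loads(sample_json)
    fields = [
        {"name": k,
         "type": TYPE_MAP.get(type(v), "string"),
         "doc": f"Type inferred from {v!r}"}
        for k, v in record.items()
    ]
    return {"type": "record", "name": record_name, "fields": fields}

schema = infer_schema('{"table": "schema1.tableName", "rows": 42}')
print(json.dumps(schema, indent=2))
```

Running this on the sample above yields a record schema with a string field for `table` and a long field for `rows`, in the same shape as the attribute NiFi produces.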

This schema can then be used for conversions directly or can be stored in Hortonworks Schema Registry or Apache NiFi's built-in Avro Registry.
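Before storing a schema in a registry, it is worth verifying that every name in it is Avro-compliant, since a registry rejection is harder to debug than a local check. A small stdlib sketch (the helper `check_names` is hypothetical, not a registry or NiFi API):

```python
import re

# Avro naming rule for records, fields, enums, and fixed types.
NAME_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def check_names(schema) -> list:
    """Recursively collect every name in an Avro schema structure that
    violates the Avro naming rule."""
    bad = []
    if isinstance(schema, dict):
        name = schema.get("name")
        if name is not None and not NAME_RE.match(name):
            bad.append(name)
        # Recurse into the places sub-schemas can appear.
        for key in ("fields", "type", "items", "values"):
            bad.extend(check_names(schema.get(key)))
    elif isinstance(schema, list):
        for item in schema:
            bad.extend(check_names(item))
    return bad

schema = {"type": "record", "name": "schema1",
          "fields": [{"name": "bad-name", "type": "string"}]}
print(check_names(schema))  # ['bad-name']
```

An empty result means the schema is safe to push to Hortonworks Schema Registry or NiFi's built-in Avro Registry.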

Now, you can use it with ConvertRecord, QueryRecord, and other record-based processors.

Example generated schema in Avro-JSON format stored in Hortonworks Schema Registry:


And that's it!



