I’m trying to read Avro data with a complex schema from a Kafka topic. I have the following schema:
```json
{
  "type": "record",
  "name": "MyAvroData",
  "fields": [
    { "name": "name", "type": "long" },
    { "name": "age", "type": "long" },
    {
      "name": "content",
      "type": [
        {
          "type": "record",
          "name": "kinda_data_field",
          "fields": [
            { "name": "a", "type": "int" },
            { "name": "b", "type": ["null", "int"] }
          ]
        },
        {
          "type": "record",
          "name": "kinda_another_field",
          "fields": [
            { "name": "stuff", "type": "long" }
          ]
        },
        {
          "type": "record",
          "name": "third_field",
          "fields": [
            { "name": "one", "type": "float" },
            { "name": "two", "type": "float" }
          ]
        }
      ]
    }
  ]
}
```
When I convert the Avro schema to SQL DataTypes with AvroSchemaConverter, I get the following SQL schema:
```sql
`name` BIGINT NOT NULL,
`age` BIGINT NOT NULL,
`content` RAW('java.lang.Object', ?) NOT NULL
```
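Pasting that into a CREATE TABLE statement in the SQL Client gives roughly the following DDL (a sketch: the table name, topic, and broker address are placeholders, not my real configuration):

```sql
CREATE TABLE MyAvroData (
  `name` BIGINT NOT NULL,
  `age` BIGINT NOT NULL,
  `content` RAW('java.lang.Object', ?) NOT NULL  -- the line the parser rejects
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'avro'
);
```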
Obviously `content` RAW('java.lang.Object', ?) NOT NULL does not work; the SQL Client throws ParseException: Encountered "?".
My questions are:

- How can I read my data using the Flink SQL Client?
- What data type should I use for the `content` column? (See the sketch below.)
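For what it’s worth, I can write a ROW type for any single branch of the union, e.g. (only a sketch, covering the kinda_data_field branch alone):

```sql
-- What I can declare for one branch only; this does not cover the union
`content` ROW<`a` INT, `b` INT>
```

but I don’t see how to declare a column that can hold any of the three record types.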
Any help will be appreciated.