Schema Registry

Confluent Schema Registry provides a serving layer for your metadata. It provides a RESTful interface for storing and retrieving your Avro®, JSON Schema, and Protobuf schemas. It stores a versioned history of all schemas based on a specified subject name strategy, provides multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting, with expanded support for these schema types. It also provides serializers that plug into Apache Kafka® clients and handle schema storage and retrieval for Kafka messages sent in any of the supported formats.
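
For example, a Kafka producer becomes registry-aware simply by configuring the Avro serializer. The following is a minimal sketch, assuming a broker on localhost:9092 and the Schema Registry on its default port 8081; the topic name "users" and the record schema are hypothetical:

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed local broker
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "io.confluent.kafka.serializers.KafkaAvroSerializer"); // registry-aware serializer
        props.put("schema.registry.url", "http://localhost:8081"); // default registry address

        // Hypothetical record schema, for illustration only.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\","
            + "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "alice");

        try (Producer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // On first send the serializer registers the schema and caches the returned id;
            // the id is embedded in each message so consumers can fetch the schema back.
            producer.send(new ProducerRecord<>("users", "user-1", user));
        }
    }
}

With the default TopicNameStrategy, the serializer registers the value schema under the subject "users-value" and embeds the returned schema id in every message it produces.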

This README includes the following sections:

  • Documentation
  • Quickstart API Usage examples
  • Installation
  • Deployment
  • Development
  • OpenAPI Spec
  • Contribute
  • License

Documentation

Full Schema Registry documentation is available in the Confluent Documentation on docs.confluent.io.

Quickstart API Usage examples

The following assumes you have Kafka and an instance of the Schema Registry running using the default settings. These examples, and more, are also available at API Usage examples on docs.confluent.io.

# Register a new version of a schema under the subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key/versions
  {"id":1}

# Register a new version of a schema under the subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
     http://localhost:8081/subjects/Kafka-value/versions
  {"id":1}

# List all subjects
$ curl -X GET http://localhost:8081/subjects
  ["Kafka-value","Kafka-key"]

# List all schema versions registered under the subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions
  [1]

# Fetch a schema by globally unique id 1
$ curl -X GET http://localhost:8081/schemas/ids/1
  {"schema":"\"string\""}

# Fetch version 1 of the schema registered under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/1
  {"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}

# Fetch the most recently registered schema under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/latest
  {"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}

# Delete version 3 of the schema registered under subject "Kafka-value"
$ curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/3
  3

# Delete all versions of the schema registered under subject "Kafka-value"
$ curl -X DELETE http://localhost:8081/subjects/Kafka-value
  [1, 2, 3, 4, 5]

# Check whether a schema has been registered under subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key
  {"subject":"Kafka-key","version":1,"id":1,"schema":"\"string\""}

# Test compatibility of a schema with the latest schema under subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/compatibility/subjects/Kafka-value/versions/latest
  {"is_compatible":true}

# Get top level config
$ curl -X GET http://localhost:8081/config
  {"compatibilityLevel":"BACKWARD"}

# Update compatibility requirements globally
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' \
    http://localhost:8081/config
  {"compatibility":"NONE"}

# Update compatibility requirements under the subject "Kafka-value"
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "BACKWARD"}' \
    http://localhost:8081/config/Kafka-value
  {"compatibility":"BACKWARD"}

Installation

You can download prebuilt versions of the schema registry as part of the Confluent Platform. To install from source, follow the instructions in the Development section.

Deployment

The REST interface to schema registry includes a built-in Jetty server. The wrapper scripts bin/schema-registry-start and bin/schema-registry-stop are the recommended method of starting and stopping the service.
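
For example, assuming the config/schema-registry.properties file referenced in the Development section below (adjust the path for a packaged installation):

# Start the service with the given configuration
bin/schema-registry-start config/schema-registry.properties

# Stop a running instance
bin/schema-registry-stop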

Development

To build a development version, you may need development versions of common and rest-utils. After installing these, you can build the Schema Registry with Maven.

This project uses the Google Java code style to keep code clean and consistent.

To build:

mvn compile

To run the unit and integration tests:

mvn test

To run an instance of Schema Registry against a local Kafka cluster (using the default configuration included with Kafka):

mvn exec:java -pl :kafka-schema-registry -Dexec.args="config/schema-registry.properties"

To create a packaged version, optionally skipping the tests:

mvn package [-DskipTests]

It produces:

  • Schema registry in package-schema-registry/target/kafka-schema-registry-package-$VERSION-package
  • Serde tools for avro/json/protobuf in package-kafka-serde-tools/target/kafka-serde-tools-package-$VERSION-package

Each of the produced packages contains a directory layout similar to the packaged binary versions.

You can also produce a standalone fat JAR of schema registry using the standalone profile:

mvn package -P standalone [-DskipTests]

This generates package-schema-registry/target/kafka-schema-registry-package-$VERSION-standalone.jar, which includes all the dependencies as well.
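
The fat JAR can then be run directly. A sketch of typical usage, assuming the jar's main class accepts a properties file the same way the wrapper scripts do:

java -jar package-schema-registry/target/kafka-schema-registry-package-$VERSION-standalone.jar \
  config/schema-registry.properties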

OpenAPI Spec

OpenAPI (formerly known as Swagger) specifications are built automatically using the swagger-maven-plugin during the compile phase.

Contribute

Thanks for helping us to make Schema Registry even better!

License

The project is licensed under the Confluent Community License, except for the client libraries, which are under the Apache 2.0 license. See the LICENSE file in each subfolder for the detailed license agreement.
