I am trying to set the valueSerde per binding, but only the default valueSerde is being considered.
AppSerdes.java
public class AppSerdes {

    public static final class DepartmentSerde extends WrapperSerde<Department> {
        public DepartmentSerde() {
            super(new ProtobufSerializer<>(), new ProtobufDeserializer<>(Department.class));
        }
    }

    public static final class EmployeeSerde extends WrapperSerde<Employee> {
        public EmployeeSerde() {
            super(new ProtobufSerializer<>(), new ProtobufDeserializer<>(Employee.class));
        }
    }

    public static final class DepartmentDataSerde extends WrapperSerde<DepartmentData> {
        public DepartmentDataSerde() {
            super(new ProtobufSerializer<>(), new ProtobufDeserializer<>(DepartmentData.class));
        }
    }
}
StreamsConfiguration.java
@Configuration
@Slf4j
public class StreamsConfiguration {

    @Bean
    public BiFunction<KStream<String, Employee>, KStream<String, Department>, KStream<String, DepartmentData>> process() {
        // Windowed left join of the employee stream with the department stream on their shared String key.
        return (employees, departments) -> employees.leftJoin(departments, (v1, v2) -> {
            if (v2 == null) {
                log.info("No Department is present");
                return null;
            } else {
                var data = DepartmentData.newBuilder();
                data.setId(v2.getId());
                data.setName(v2.getName());
                data.addEmployees(v1);
                return data.build();
            }
        }, JoinWindows.of(Duration.ofMinutes(1))).peek((k, v) -> {
            log.info("Key->{}, value->{}", k, v);
        });
    }
}
and application.yml
spring:
  application.name: kafka-join-example

spring.kafka.bootstrap-servers: 192.168.56.101:19092

spring.cloud.stream.kafka.streams.bindings.process-in-0.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$EmployeeSerde
spring.cloud.stream.kafka.streams.bindings.process-in-1.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentSerde
spring.cloud.stream.kafka.streams.bindings.process-out-0.producer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentDataSerde

spring.cloud.stream:
  bindings:
    process-in-0:
      destination: kp.sch
      consumer:
        use-native-decoding: true
    process-in-1:
      destination: kp.lm
      consumer:
        use-native-decoding: true
    process-out-0:
      destination: kp.lm.sch
      producer:
        use-native-encoding: true
  kafka.streams.bindings:
    process-in-0:
      consumer:
        value.serde: io.github.kprasad99.streams.AppSerdes$EmployeeSerde
    process-in-1:
      consumer:
        value.serde: io.github.kprasad99.streams.AppSerdes$DepartmentSerde
    process-out-0:
      producer:
        value.serde: io.github.kprasad99.streams.AppSerdes$DepartmentDataSerde
  kafka.streams.binder:
    brokers:
      - 192.168.56.101:19092
    # replication-factor: 3
    # required-acks: 2
    min-partition-count: 5
    configuration:
      commit.interval.ms: 100
      default:
        key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
        value.serde: io.github.kprasad99.streams.kafka.serde.ProtobufSerde
Any insight into this would be much appreciated.

EDIT

The complete sample code is here.

Stack trace:

2020-12-06 21:55:39.929 ERROR 141897 --- [-StreamThread-1]
o.a.k.s.p.i.AssignedStreamsTasks : stream-thread [kafka-join-example-4aa10729-e5b7-4a33-89c6-906dd8ab2e5d-StreamThread-1] Failed to process stream task 1_4 due to the following error:
org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=1_4, processor=KSTREAM-SOURCE-0000000001, topic=kp.department, partition=4, offset=1, stacktrace=org.apache.kafka.common.errors.SerializationException: java.lang.InstantiationException: No target type provided
Caused by: java.lang.InstantiationException: No target type provided
at io.github.kprasad99.streams.protobuf.serialization.ProtobufDeserializer.deserialize(ProtobufDeserializer.java:45)
at io.github.kprasad99.streams.protobuf.serialization.ProtobufDeserializer.deserialize(ProtobufDeserializer.java:1)
at org.apache.kafka.streams.state.StateSerdes.valueFrom(StateSerdes.java:160)
at org.apache.kafka.streams.state.internals.MeteredWindowStoreIterator.next(MeteredWindowStoreIterator.java:56)
at org.apache.kafka.streams.state.internals.MeteredWindowStoreIterator.next(MeteredWindowStoreIterator.java:26)
at org.apache.kafka.streams.kstream.internals.KStreamKStreamJoin$KStreamKStreamJoinProcessor.process(KStreamKStreamJoin.java:100)
at org.apache.kafka.streams.processor.internals.ProcessorNode.lambda$process$2(ProcessorNode.java:142)
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:142)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:55)
at org.apache.kafka.streams.processor.internals.ProcessorNode.lambda$process$2(ProcessorNode.java:142)
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:142)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:104)
at org.apache.kafka.streams.processor.internals.StreamTask.lambda$process$3(StreamTask.java:383)
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:383)
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:475)
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:550)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:802)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:697)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:670)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:400) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:475) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:550) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:802) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:697) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:670) ~[kafka-streams-2.5.1.jar:na]
Caused by: org.apache.kafka.common.errors.SerializationException: java.lang.InstantiationException: No target type provided
Caused by: java.lang.InstantiationException: No target type provided
at io.github.kprasad99.streams.protobuf.serialization.ProtobufDeserializer.deserialize(ProtobufDeserializer.java:45) ~[classes/:na]
at io.github.kprasad99.streams.protobuf.serialization.ProtobufDeserializer.deserialize(ProtobufDeserializer.java:1) ~[classes/:na]
at org.apache.kafka.streams.state.StateSerdes.valueFrom(StateSerdes.java:160) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.state.internals.MeteredWindowStoreIterator.next(MeteredWindowStoreIterator.java:56) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.state.internals.MeteredWindowStoreIterator.next(MeteredWindowStoreIterator.java:26) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.kstream.internals.KStreamKStreamJoin$KStreamKStreamJoinProcessor.process(KStreamKStreamJoin.java:100) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.lambda$process$2(ProcessorNode.java:142) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:142) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:55) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.lambda$process$2(ProcessorNode.java:142) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:142) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:104) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamTask.lambda$process$3(StreamTask.java:383) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:806) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:383) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:475) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:550) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:802) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:697) ~[kafka-streams-2.5.1.jar:na]
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:670) ~[kafka-streams-2.5.1.jar:na]
2020-12-06 21:55:39.930 ERROR 141897 --- [-StreamThread-1] o.a.k.s.p.internals.StreamThread : stream-thread [kafka-join-example-4aa10729-e5b7-4a33-89c6-906dd8ab2e5d-StreamThread-1] Encountered the following unexpected Kafka exception during processing, this usually indicate Streams internal errors:
org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=1_4, processor=KSTREAM-SOURCE-0000000001, topic=kp.department, partition=4, offset=1, stacktrace=org.apache.kafka.common.errors.SerializationException: java.lang.InstantiationException: No target type provided
Answer

I tidied up your configuration a bit; it should work now. If it still does not, please create a small sample application and share it, and then we can look into it further.
spring:
  application.name: kafka-join-example

spring.kafka.bootstrap-servers: 192.168.56.101:19092

spring.cloud.stream.kafka.streams.bindings.process-in-0.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$EmployeeSerde
spring.cloud.stream.kafka.streams.bindings.process-in-1.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentSerde
spring.cloud.stream.kafka.streams.bindings.process-out-0.producer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentDataSerde

spring.cloud.stream:
  bindings:
    process-in-0:
      destination: kp.sch
    process-in-1:
      destination: kp.lm
    process-out-0:
      destination: kp.lm.sch
  kafka.streams.binder:
    brokers:
      - 192.168.56.101:19092
    # replication-factor: 3
    # required-acks: 2
    min-partition-count: 5
    configuration:
      commit.interval.ms: 100
      default:
        key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
        value.serde: io.github.kprasad99.streams.kafka.serde.ProtobufSerde
You can also define these Serde beans in the application, as shown below.
@Bean
public Serde<Department> departmentSerde() {
    return new DepartmentSerde();
}
// add the other two Serde beans.
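For completeness, the remaining two beans would presumably mirror the first one, using the Serde classes from AppSerdes above (a sketch; the bean method names are illustrative):

// Sketch of the other two Serde beans referenced by the comment above.
@Bean
public Serde<Employee> employeeSerde() {
    return new EmployeeSerde();
}

@Bean
public Serde<DepartmentData> departmentDataSerde() {
    return new DepartmentDataSerde();
}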
If you define these Serde beans in the application, you do not need the corresponding three valueSerde properties in the configuration, because the beans take precedence.
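Concretely, with the three beans registered, the flat entries below could presumably be dropped from the YAML above (shown commented out as a sketch of the intent, not verified against the sample application):

# no longer needed once the Serde beans are defined:
# spring.cloud.stream.kafka.streams.bindings.process-in-0.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$EmployeeSerde
# spring.cloud.stream.kafka.streams.bindings.process-in-1.consumer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentSerde
# spring.cloud.stream.kafka.streams.bindings.process-out-0.producer.valueSerde: io.github.kprasad99.streams.AppSerdes$DepartmentDataSerde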