I am working with Spring Boot v2.2.6.RELEASE + Spring Batch. In this example, I want to read a CSV file, load the data into MySQL, and have the Spring Batch metadata tables created in a Postgres database.
But this gives me an error:
java.lang.IllegalStateException: Failed to execute CommandLineRunner
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:787) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:768) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:322) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1215) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at com.example.SpringBatchMetadataInDiffSchemaApplication.main(SpringBatchMetadataInDiffSchemaApplication.java:27) [classes/:na]
Caused by: org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist
Position: 39
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:235) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1443) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:633) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:669) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:700) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:712) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:768) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.batch.core.repository.dao.JdbcJobInstanceDao.getJobInstance(JdbcJobInstanceDao.java:151) ~[spring-batch-core-4.2.1.RELEASE.jar:4.2.1.RELEASE]
at org.springframework.batch.core.repository.support.SimpleJobRepository.getLastJobExecution(SimpleJobRepository.java:271) ~[spring-batch-core-4.2.1.RELEASE.jar:4.2.1.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:366) ~[spring-tx-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:99) ~[spring-tx-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at com.sun.proxy.$Proxy61.getLastJobExecution(Unknown Source) ~[na:na]
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:104) ~[spring-batch-core-4.2.1.RELEASE.jar:4.2.1.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127) ~[spring-batch-core-4.2.1.RELEASE.jar:4.2.1.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212) ~[spring-aop-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at com.sun.proxy.$Proxy58.run(Unknown Source) ~[na:na]
at com.example.SpringBatchMetadataInDiffSchemaApplication.run(SpringBatchMetadataInDiffSchemaApplication.java:36) [classes/:na]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:784) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
... 5 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist
Position: 39
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2497) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2233) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:310) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:446) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:370) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:149) ~[postgresql-42.2.8.jar:42.2.8]
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:108) ~[postgresql-42.2.8.jar:42.2.8]
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52) ~[HikariCP-3.4.1.jar:na]
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java) ~[HikariCP-3.4.1.jar:na]
at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:678) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:617) ~[spring-jdbc-5.2.2.RELEASE.jar:5.2.2.RELEASE]
... 37 common frames omitted
@Configuration
public class JobConfig {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Qualifier("secondaryDS")
    @Autowired
    private DataSource dataSource;

    @Bean
    public FlatFileItemReader<Customer> customerItemReader() {
        FlatFileItemReader<Customer> reader = new FlatFileItemReader<>();
        reader.setLinesToSkip(1);
        reader.setResource(new ClassPathResource("/data/customer.csv"));

        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] {"id", "firstName", "lastName", "birthdate"});

        DefaultLineMapper<Customer> customerLineMapper = new DefaultLineMapper<>();
        customerLineMapper.setLineTokenizer(tokenizer);
        customerLineMapper.setFieldSetMapper(new CustomerFieldSetMapper());
        customerLineMapper.afterPropertiesSet();

        reader.setLineMapper(customerLineMapper);
        return reader;
    }

    @Bean
    public JdbcBatchItemWriter<Customer> customerItemWriter() {
        JdbcBatchItemWriter<Customer> writer = new JdbcBatchItemWriter<>();
        writer.setDataSource(this.dataSource);
        writer.setSql("INSERT INTO CUSTOMER VALUES (:id, :firstName, :lastName, :birthdate)");
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
        writer.afterPropertiesSet();
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Customer, Customer>chunk(1000)
                .reader(customerItemReader())
                .writer(customerItemWriter())
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get("job")
                .start(step1())
                .build();
    }
}
@Configuration
public class DatabaseConfig {

    @Autowired
    private Environment env;

    // For Test schema
    @Bean(name = "secondaryDS")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource getSeconadaryBatchDataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("spring.datasource.url"))
                .driverClassName(env.getProperty("spring.datasource.driver-class-name"))
                .username(env.getProperty("spring.datasource.username"))
                .password(env.getProperty("spring.datasource.password"))
                .build();
    }

    // For "batchmetadata" tables
    @Bean(name = "primaryDS")
    @Primary
    @ConfigurationProperties(prefix = "spring.hello.datasource")
    public DataSource getPrimaryBatchDataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("spring.hello.datasource.url"))
                .driverClassName(env.getProperty("spring.hello.datasource.driver-class-name"))
                .username(env.getProperty("spring.hello.datasource.username"))
                .password(env.getProperty("spring.hello.datasource.password"))
                .build();
    }

    // Ref: https://stackoverflow.com/questions/28275448/multiple-data-source-and-schema-creation-in-spring-boot
    @Bean(name = "primaryEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean primaryEntityManagerFactory(EntityManagerFactoryBuilder builder) {
        Map<String, Object> properties = new HashMap<>();
        properties.put("hibernate.hbm2ddl.auto", "update");
        return builder
                .dataSource(getSeconadaryBatchDataSource())
                .packages("com.example.model")
                .persistenceUnit("default")
                .properties(properties)
                .build();
    }

    /*@Bean(name = "secondaryEntityManagerFactory")
    @Primary
    public LocalContainerEntityManagerFactoryBean secondaryEntityManagerFactory(
            EntityManagerFactoryBuilder builder) {
        Map<String, Object> properties = new HashMap<String, Object>();
        properties.put("hibernate.hbm2ddl.auto", "create");
        return builder
                .dataSource(getPrimaryBatchDataSource())
                .packages("com.example.model")
                .persistenceUnit("default")
                .properties(properties)
                .build();
    }*/
}
# MySQL
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/test
spring.datasource.username=root
spring.datasource.password=root
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.MySQL5Dialect
spring.jpa.hibernate.ddl-auto=create
spring.jpa.show-sql=true
spring.jpa.properties.hibernate.format_sql=true
# Postgres
spring.hello.datasource.driver-class-name=org.postgresql.Driver
spring.hello.datasource.url=jdbc:postgresql://localhost:5432/program?currentSchema=BATCH
spring.hello.datasource.username=postgres
spring.hello.datasource.password=admin
#spring.batch.table-prefix=batchmetadata.BATCH_
spring.batch.initialize-schema=always
spring.batch.job.enabled=false
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
spring.jpa.hibernate.naming.implicit-strategy=org.hibernate.boot.model.naming.ImplicitNamingStrategyLegacyJpaImpl
Perhaps you are creating the tables with quoted identifiers such as "batch_job_instance" (in Postgres, quoted names are case-sensitive, while unquoted names are folded to lower case). Also, the schema has to be specified in the datasource URL, because the query itself (SELECT ... FROM BATCH_JOB_INSTANCE ...) does not qualify the table name with a schema.
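To see which of the two applies, a small diagnostic query against information_schema can show where, and with which exact spelling, the Spring Batch tables actually ended up. This is only a sketch: the class name FindBatchTables is hypothetical, and the URL, username and password are taken from the application.properties in the question, so they are assumptions about the local environment.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FindBatchTables {
    public static void main(String[] args) throws Exception {
        // Connection settings copied from the question's application.properties
        // (assumed to match the local environment).
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/program", "postgres", "admin");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT table_schema, table_name FROM information_schema.tables "
                             + "WHERE lower(table_name) LIKE 'batch_%'");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // No rows: the metadata tables were never created in this database.
                // Rows under another schema or with different casing: the quoting /
                // currentSchema mismatch described above.
                System.out.println(rs.getString("table_schema") + "." + rs.getString("table_name"));
            }
        }
    }
}

If nothing is printed, the schema initialization never ran against this Postgres database; if the tables show up under a different schema or with a different case than the unqualified query expects, that matches the quoting/currentSchema point above.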