Question:

How can I prevent duplicate job creation with Spring Boot and Spring Batch?

窦宏旷
2023-03-14

I have built a Spring Boot application with Spring Batch and scheduling. Everything works fine while there is only a single job. But when I tried to add a second job using the modular approach, the jobs and their steps ran multiple times and were duplicated.

I get the following error:

2017-08-24 16:05:00.581  INFO 16172 --- [cTaskExecutor-2] o.s.b.c.l.support.SimpleJobLauncher      : Job: [FlowJob: [name=importDeptJob]] completed with the following parameters: [{JobID1=1503579900035}] and the following status: [FAILED]
2017-08-24 16:05:00.581 ERROR 16172 --- [cTaskExecutor-2] o.s.batch.core.step.tasklet.TaskletStep  : JobRepository failure forcing rollback

org.springframework.dao.OptimisticLockingFailureException: Attempt to update step execution id=1 with wrong version (3), where current version is 1

Can anyone guide me on how to fix this so that the jobs run in parallel, independently of each other?

Here are the configuration classes: ModularJobConfiguration.java, DeptBatchConfiguration.java, CityBatchConfiguration.java and BatchScheduler.java:

@Configuration
@EnableBatchProcessing(modular=true)
public class ModularJobConfiguration {

    @Bean
    public ApplicationContextFactory firstJob() {
        return new GenericApplicationContextFactory(DeptBatchConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory secondJob() {
        return new GenericApplicationContextFactory(CityBatchConfiguration.class);
    }

}


@Configuration
@EnableBatchProcessing
@Import({BatchScheduler.class})
public class DeptBatchConfiguration {

    private static final Logger LOGGER = LoggerFactory.getLogger(DeptBatchConfiguration.class);

    @Autowired
    private SimpleJobLauncher jobLauncher;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public JobExecutionListener listener;

    public ItemReader<DepartmentModelReader> deptReaderSO;


    @Autowired
    @Qualifier("dataSourceReader")
    private DataSource dataSourceReader;


    @Autowired
    @Qualifier("dataSourceWriter")
    private DataSource dataSourceWriter;



    @Scheduled(cron = "0 0/1 * * * ?")
    public void performFirstJob() throws Exception {

        long startTime = System.currentTimeMillis();
        LOGGER.info("Job1 Started at :" + new Date());
        JobParameters param = new JobParametersBuilder().addString("JobID1",String.valueOf(System.currentTimeMillis())).toJobParameters();

        JobExecution execution = (JobExecution) jobLauncher.run(importDeptJob(jobBuilderFactory,stepdept(deptReaderSO,customWriter()),listener), param);

        long endTime = System.currentTimeMillis();
        LOGGER.info("Job1 finished at " + (endTime - startTime) / 1000  + " seconds with status :" + execution.getExitStatus());
    }

    @Bean
    public ItemReader<DepartmentModelReader> deptReaderSO() {
        //LOGGER.info("Inside deptReaderSO Method");
        JdbcCursorItemReader<DepartmentModelReader> deptReaderSO = new JdbcCursorItemReader<>();
        //deptReaderSO.setSql("select id, firstName, lastname, random_num from reader");
        deptReaderSO.setSql("SELECT DEPT_CODE,DEPT_NAME,FULL_DEPT_NAME,CITY_CODE,CITY_NAME,CITY_TYPE_NAME,CREATED_USER_ID,CREATED_G_DATE,MODIFIED_USER_ID,MODIFIED_G_DATE,RECORD_ACTIVITY,DEPT_CLASS,DEPT_PARENT,DEPT_PARENT_NAME FROM TBL_SAMPLE_SAFTY_DEPTS");
        deptReaderSO.setDataSource(dataSourceReader);
        deptReaderSO.setRowMapper(
                (ResultSet resultSet, int rowNum) -> {
                    if (!(resultSet.isAfterLast()) && !(resultSet.isBeforeFirst())) {
                        DepartmentModelReader recordSO = new DepartmentModelReader();
                        recordSO.setDeptCode(resultSet.getString("DEPT_CODE"));
                        recordSO.setDeptName(resultSet.getString("DEPT_NAME"));
                        recordSO.setFullDeptName(resultSet.getString("FULL_DEPT_NAME"));
                        recordSO.setCityCode(resultSet.getInt("CITY_CODE"));
                        recordSO.setCityName(resultSet.getString("CITY_NAME"));
                        recordSO.setCityTypeName(resultSet.getString("CITY_TYPE_NAME"));
                        recordSO.setCreatedUserId(resultSet.getInt("CREATED_USER_ID"));
                        recordSO.setCreatedGDate(resultSet.getDate("CREATED_G_DATE"));
                        recordSO.setModifiedUserId(resultSet.getString("MODIFIED_USER_ID"));
                        recordSO.setModifiedGDate(resultSet.getDate("MODIFIED_G_DATE"));
                        recordSO.setRecordActivity(resultSet.getInt("RECORD_ACTIVITY"));
                        recordSO.setDeptClass(resultSet.getInt("DEPT_CLASS"));
                        recordSO.setDeptParent(resultSet.getString("DEPT_PARENT"));
                        recordSO.setDeptParentName(resultSet.getString("DEPT_PARENT_NAME"));

                       // LOGGER.info("RowMapper record : {}", recordSO.getDeptCode() +" | "+recordSO.getDeptName());
                        return recordSO;
                    } else {
                        LOGGER.info("Returning null from rowMapper");
                        return null;
                    }
                });
        return deptReaderSO;
    }

    @Bean
    public ItemProcessor<DepartmentModelReader, DepartmentModelWriter> processor() {
        //LOGGER.info("Inside Processor Method");
        return new RecordProcessor();
    }

    @Bean
    public ItemWriter<DepartmentModelWriter> customWriter(){
        //LOGGER.info("Inside customWriter Method");
        return new CustomItemWriter();
    }

    @Bean
    public Job importDeptJob(JobBuilderFactory jobs, Step stepdept,JobExecutionListener listener){
        return jobs.get("importDeptJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener())
                .flow(stepdept).end().build();
    }

    @Bean
    public Step stepdept(ItemReader<DepartmentModelReader> deptReaderSO,
            ItemWriter<DepartmentModelWriter> writerSO) {
        LOGGER.info("Inside stepdept Method");

        return stepBuilderFactory.get("stepdept").<DepartmentModelReader, DepartmentModelWriter>chunk(5)
                .reader(deptReaderSO).processor(processor()).writer(customWriter()).transactionManager(platformTransactionManager(dataSourceWriter)).build();
    }

    @Bean
    public JobExecutionListener listener() {
        return new JobCompletionNotificationListener();
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public BatchWriteService batchWriteService() {
        return new BatchWriteService();
    }

    @Bean
    public PlatformTransactionManager platformTransactionManager(@Qualifier("dataSourceWriter") DataSource dataSourceWriter) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setDataSource(dataSourceWriter);
        return transactionManager;
    }
}



@Configuration
@EnableBatchProcessing
@Import({BatchScheduler.class})
public class CityBatchConfiguration {

    private static final Logger LOGGER = LoggerFactory.getLogger(CityBatchConfiguration.class);

    @Autowired
    private SimpleJobLauncher jobLauncher;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public JobExecutionListener listener;

    public ItemReader<CitiesModelReader> citiesReaderSO;

    @Autowired
    @Qualifier("dataSourceReader")
    private DataSource dataSourceReader;


    @Autowired
    @Qualifier("dataSourceWriter")
    private DataSource dataSourceWriter;


    @Scheduled(cron = "0 0/1 * * * ?")
    public void performSecondJob() throws Exception {

        long startTime = System.currentTimeMillis();
        LOGGER.info("\n Job2 Started at :" + new Date());

        JobParameters param = new JobParametersBuilder().addString("JobID2",String.valueOf(System.currentTimeMillis())).toJobParameters();

        JobExecution execution = (JobExecution) jobLauncher.run(importCitiesJob(jobBuilderFactory,stepcity(citiesReaderSO,customCitiesWriter()),listener), param);

        long endTime = System.currentTimeMillis();
        LOGGER.info("Job2 finished at " + (endTime - startTime) / 1000  + " seconds with status :" + execution.getExitStatus());
    }


    @Bean
    public ItemReader<CitiesModelReader> citiesReaderSO() {
        //LOGGER.info("Inside readerSO Method");
        JdbcCursorItemReader<CitiesModelReader> readerSO = new JdbcCursorItemReader<>();
        readerSO.setSql("SELECT CITY_CODE,CITY_NAME,PARENT_CITY,CITY_TYPE,CITY_TYPE_NAME,CREATED_G_DATE,CREATED_USER_ID,MODIFIED_G_DATE,MODIFIED_USER_ID,RECORD_ACTIVITY FROM TBL_SAMPLE_SAFTY_CITIES");
        readerSO.setDataSource(dataSourceReader);
        readerSO.setRowMapper(
                (ResultSet resultSet, int rowNum) -> {
                    if (!(resultSet.isAfterLast()) && !(resultSet.isBeforeFirst())) {
                        CitiesModelReader recordSO = new CitiesModelReader();
                        recordSO.setCityCode(resultSet.getLong("CITY_CODE"));
                        recordSO.setCityName(resultSet.getString("CITY_NAME"));
                        recordSO.setParentCity(resultSet.getInt("PARENT_CITY"));
                        recordSO.setCityType(resultSet.getString("CITY_TYPE"));
                        recordSO.setCityTypeName(resultSet.getString("CITY_TYPE_NAME"));
                        recordSO.setCreatedGDate(resultSet.getDate("CREATED_G_DATE"));
                        recordSO.setCreatedUserId(resultSet.getString("CREATED_USER_ID"));
                        recordSO.setModifiedGDate(resultSet.getDate("MODIFIED_G_DATE"));
                        recordSO.setModifiedUserId(resultSet.getString("MODIFIED_USER_ID"));
                        recordSO.setRecordActivity(resultSet.getInt("RECORD_ACTIVITY"));

                        //LOGGER.info("RowMapper record : {}", recordSO.toString());
                        return recordSO;
                    } else {
                        LOGGER.info("Returning null from rowMapper");
                        return null;
                    }
                });
        return readerSO;
    }


    @Bean
    public ItemProcessor<CitiesModelReader,CitiesModelWriter> citiesProcessor() {
        //LOGGER.info("Inside Processor Method");
        return new RecordCitiesProcessor();
    }


    @Bean
    public ItemWriter<CitiesModelWriter> customCitiesWriter(){
        LOGGER.info("Inside customCitiesWriter Method");
        return new CustomCitiesWriter();
    }           

    @Bean
    public Job importCitiesJob(JobBuilderFactory jobs, Step stepcity,JobExecutionListener listener) {

        LOGGER.info("Inside importCitiesJob Method");
        return jobs.get("importCitiesJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener())
                .flow(stepcity).end().build();
    }


    @Bean
    public Step stepcity(ItemReader<CitiesModelReader> readerSO,
            ItemWriter<CitiesModelWriter> writerSO) {
        LOGGER.info("Inside stepCity Method");

        return stepBuilderFactory.get("stepcity").<CitiesModelReader, CitiesModelWriter>chunk(5)
                .reader(readerSO).processor(citiesProcessor()).writer(customCitiesWriter()).transactionManager(platformTransactionManager(dataSourceWriter)).build();
    }

    @Bean
    public JobExecutionListener listener() {
        return new JobCompletionNotificationListener();
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public BatchWriteService batchWriteService() {
        return new BatchWriteService();
    }

    @Bean
    public PlatformTransactionManager platformTransactionManager(@Qualifier("dataSourceWriter") DataSource dataSourceWriter) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setDataSource(dataSourceWriter);
        return transactionManager;
    }
}





@Configuration
@EnableScheduling
public class BatchScheduler {

    private static final Logger LOGGER = LoggerFactory.getLogger(BatchScheduler.class);

    @Bean
    public ResourcelessTransactionManager resourcelessTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(
            ResourcelessTransactionManager txManager) throws Exception {

        LOGGER.info("Inside mapJobRepositoryFactory method");

        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);

        factory.afterPropertiesSet();

        return factory;
    }

    @Bean
    public JobRepository jobRepository(
            MapJobRepositoryFactoryBean factory) throws Exception {

        LOGGER.info("Inside jobRepository method");

        return factory.getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {

        LOGGER.info("Inside jobLauncher method");

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        final SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
        launcher.setTaskExecutor(simpleAsyncTaskExecutor);
        return launcher;
    }
}

1 Answer

楚墨一
2023-03-14

The Map-based SimpleJobRepository created by MapJobRepositoryFactoryBean is not thread-safe.

From its Javadoc:

A FactoryBean that automates the creation of a SimpleJobRepository using non-persistent in-memory DAO implementations. This repository is only really intended for use in testing and rapid prototyping. In such settings you might find that ResourcelessTransactionManager is useful (as long as your business logic does not use a relational database). Not suited for use in multi-threaded jobs with splits, although it should be safe to use in a multi-threaded step.

To run the jobs in parallel you therefore need a JobRepository backed by a real database. The simplest option is an embedded H2 database: add the H2 dependency to your project

<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>

and point the Spring Boot DataSource at it in application.properties:
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=

Alternatively, to use some other JDBC-backed JobRepository, add the JDBC dependencies for the RDBMS of your choice to the project and configure a DataSource for it (either in code as a DataSource bean, or in application.properties using the spring.datasource prefix, as shown above). Spring Boot will then automatically use that DataSource when it creates the JobRepository bean.
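For the in-code variant, a minimal sketch of such a DataSource bean could look like the following. This is an illustration, not part of the original answer: the class name is hypothetical, the H2 settings mirror the properties above, and DriverManagerDataSource is used only for brevity (a pooled DataSource such as HikariCP would normally be preferable in production).

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class BatchDataSourceConfiguration {

    // Illustrative JDBC DataSource for the Spring Batch metadata tables.
    // The H2 settings mirror the application.properties shown above; for any
    // other RDBMS, substitute its driver class, URL and credentials.
    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:mem:testdb");
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }
}

With a database-backed DataSource in place, the ResourcelessTransactionManager, MapJobRepositoryFactoryBean and jobRepository beans in BatchScheduler should no longer be needed: the auto-configured JobRepository will persist job and step executions in the database and handle concurrent updates correctly.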
