Question:

Spring Batch cannot load JobBuilderFactory in my integration test

鱼旺
2023-03-14

I have a configuration that runs successfully: it loads cell line data and publishes it to the various recipients on a cell line topic. It works fine, but when I try to load JobLauncherTestUtils and JobRepositoryTestUtils I get an error saying that JobBuilderFactory cannot be found. As you will see from my configuration, I do inject the JobBuilderFactory and StepBuilderFactory through Lombok, which delegates the wiring to Spring. As I said, all of this works fine outside of the test. Here is the test configuration yaml file.

application-test.yml

    spring:
        sql:
         init:
            schema-locations: classpath:db/migration
            platform: derby



        jobmeta-ds:
            datasource:
              driver-class-name: org.apache.derby.jdbc.EmbeddedDriver
              url: jdbc:derby:support/jhhmeta;create=true
              password:
              jndi-name: false

        cell-datasource:
            datasource:
              driver-class-name: oracle.jdbc.driver.OracleDriver
              url: jdbc:oracle:thin:@localhost:1521:xe
              password:
              jndi-name: false

Here are the datasources:

        // CellDbConfig class

        @Configuration
        public class CellDbConfig {

            @Bean
            @ConfigurationProperties("cell-datasource")
            public DataSourceProperties cellLineDataSourceProperties() {
                return new DataSourceProperties();
            }

            @Bean(name = "cellDataSource")
            public DataSource cellDataSource() {
                return cellLineDataSourceProperties().initializeDataSourceBuilder()
                        .type(HikariDataSource.class)
                        .build();
            }

            @Bean(name = "cellJdbcTemplate")
            public JdbcTemplate cellJdbcTemplate(@Qualifier("cellDataSource") DataSource cellDataSource) {
                return new JdbcTemplate(cellDataSource);
            }
        }

Here is the other datasource, used for the JobRepository metadata configuration:

        @Configuration
        public class JobRepoMetadataDbConfig {
    
        @Primary
        @Bean
        @ConfigurationProperties("jobmeta.datasource")
        public DataSourceProperties jobMetadataProperties() {
            return new DataSourceProperties();
        }
    
        @Primary
        @Bean(name = "jobMetaDataSource")
        public DataSource dataSourceJobMeta() {
            DataSource dataSource = jobMetadataProperties().initializeDataSourceBuilder().type(BasicDataSource.class)
                    .build();
            return dataSource;
        }
    
        @Bean(name = "jobMetaJdbcTemplate")
        public JdbcTemplate jobMetaJdbcTemplate(@Qualifier("jobMetaDataSource") DataSource jobMetaDataSource) {
            return new JdbcTemplate(jobMetaDataSource);
        }
    
    }

Here is the Spring Batch specific configuration, i.e. the JobRepository etc.:

       @Configuration
       @EnableBatchProcessing
       @RequiredArgsConstructor
       public class JobRepoConfig {
    
        @Qualifier("jobMetaDataSource")
        final DataSource jobMetaDataSource;
    
        @Bean
        AbstractPlatformTransactionManager jobTransactionManager() {
            return new ResourcelessTransactionManager();
        }
    
    
        @Bean
        public JobRepositoryFactoryBean jobRepositoryFactory() throws Exception {
            JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
            factory.setDataSource(jobMetaDataSource);
            factory.setTransactionManager(jobTransactionManager());
            factory.afterPropertiesSet();
            return factory;
        }

    
        @Bean
        public JobRepository jobRepository() throws Exception {
            JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
            jobRepositoryFactoryBean.setDataSource(jobMetaDataSource);
            jobRepositoryFactoryBean.setTransactionManager(jobTransactionManager());
            jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
            return jobRepositoryFactoryBean.getObject();
        }
    
        @Bean
        public SimpleJobLauncher launchAppJobLauncher() throws Exception{
            SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
            simpleJobLauncher.setJobRepository(jobRepository());
            return simpleJobLauncher;
        }
    
    }

Here is the KafkaProducer configuration that publishes the cell line data:

        @Configuration
        @Slf4j
        public class ProducerConfig {
    
        @Value("${spring.kafka.template.default-topic}")
        private String cellsTopic;
    
        @Bean
        public ProducerFactory<Long, CellVO> kafkaProducerFactory(KafkaProperties kafkaProperties) {
            var producerProperties = kafkaProperties.buildProducerProperties();
    
            var sslProperties = kafkaProperties.getSsl().buildProperties();
    
    
            Map<String, Object> props = new HashMap<>(producerProperties);
            if (!CollectionUtils.isEmpty(sslProperties)) {
                props.putAll(sslProperties);
            }
    
            return new DefaultKafkaProducerFactory<>(props);
        }
    
        @Bean
        public KafkaTemplate<Long, CellVO> kafkaTemplate(ProducerFactory<Long, CellVO> kafkaProducerFactory) {
            KafkaTemplate<Long, CellVO> kafkaTemplate = new KafkaTemplate<Long, CellVO>(kafkaProducerFactory);
            kafkaTemplate.setDefaultTopic(cellsTopic);
            return kafkaTemplate;
        }
    }

Here is the Spring Batch test class:

        @SpringBatchTest
        @SpringBootTest
        @ActiveProfiles("test")
        @Tag("integration")
        @EnableAutoConfiguration
        public class CellCongTest {
    
    
        @Autowired
        private JobLauncherTestUtils jobLauncherTestUtils;
    
    
        @Autowired
        private JobRepositoryTestUtils jobRepositoryTestUtils;
    
    
        @Test
        public void testSuccessfulLoad() throws Exception {
    
        }
    
      }

And finally the batch job itself:

    @Configuration
    @EnableScheduling
    @RequiredArgsConstructor
    @Slf4j
    public class CellBatchJobConfig {
    
        final JobBuilderFactory jobBuilderFactory;
        final JobLauncher jobAppJobLauncher;
        final StepBuilderFactory stepBuilderFactory;
        final KafkaTemplate<Long, CellVO> kafkaTemplate;
        final KafkaItemWriteListener kafkaItemWriteListener;
        final static String CELL_LINE_JOB = "CELL_LINE_JOB";
    
    
        @Value("${chunk-size}")
        private int chunkSize;
    
        @Qualifier("cellDataSource")
        final DataSource cellDataSource;
    
    
        @Bean
        public JdbcPagingItemReader<CellVO> cellDataReader(
                PagingQueryProvider pagingQueryProvider) {
            return new JdbcPagingItemReaderBuilder<CellVO>()
                    .name("cellDataReader")
                    .dataSource(cellDataSource)
                    .queryProvider(pagingQueryProvider)
                    .pageSize(chunkSize)
                    .rowMapper(new CellRowMapper())
                    .build();
        }
    
        @Bean
        public PagingQueryProvider pagingQueryProvider() {
            OraclePagingQueryProvider pagingQueryProvider = new OraclePagingQueryProvider();
            final Map<String, Order> sortKeys = new HashMap<>();
            sortKeys.put("CELL_ID", Order.ASCENDING);
            pagingQueryProvider.setSortKeys(sortKeys);
            pagingQueryProvider.setSelectClause(" CELL_ID, CELL_TYPE, SITE, CELL_QUALITY_LINE ");
            pagingQueryProvider.setFromClause(" FROM DCV.CELL_LINES");
            return pagingQueryProvider;
        }
    
    
        @Bean
        public KafkaItemWriter<Long, CellVO> kafkaItemWriter() throws Exception {
            KafkaItemWriter<Long, CellVO> kafkaItemWriter = new KafkaItemWriter<>();
            kafkaItemWriter.setKafkaTemplate(kafkaTemplate);
            kafkaItemWriter.setItemKeyMapper(CellVO::getLocationId);
            kafkaItemWriter.setDelete(false);
            kafkaItemWriter.afterPropertiesSet();
            return kafkaItemWriter;
        }
    
    
        @Bean
        public Step loadCellLines() throws Exception {
            return stepBuilderFactory.get("step1")
                    .<CellVO, CellVO>chunk(chunkSize)
                    .reader(cellDataReader(pagingQueryProvider()))
                    .writer(kafkaItemWriter())
                    .listener(kafkaItemWriteListener)
                    .build();
        }
    
    
        @Bean
        public Job cellLineJob() throws Exception {
            return jobBuilderFactory.get(CELL_LINE_JOB)
                    .incrementer(new RunIdIncrementer())
                    .start(loadCellLines())
                    .build();
        }
    
        @Bean("jobParameters")
        JobParameters jobParameters() {
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("jobId", UUID.randomUUID().toString())
                    .addDate("date", new Date())
                    .addLong("time", System.currentTimeMillis()).toJobParameters();
            return jobParameters;
        }
    
    
       @Scheduled(cron = "0 0 5 * * *")
        public Job runCellLineJob() throws Exception {
            kafkaItemWriteListener.setItems(new ArrayList<>());
           return jobBuilderFactory.get(CELL_LINE_JOB)
                   .incrementer(new RunIdIncrementer())
                   .start(loadCellLines())
                   .build();
        }
    
    }
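
For reference, since the class above only uses Lombok's @RequiredArgsConstructor, the constructor Spring complains about is the one Lombok generates from the final fields. It is not written out anywhere in my source, but it effectively looks like the sketch below, so "constructor parameter 0" in the error corresponds to the JobBuilderFactory:

    // Sketch only: the constructor generated by @RequiredArgsConstructor,
    // taking the final fields in declaration order.
    public CellBatchJobConfig(JobBuilderFactory jobBuilderFactory,
                              JobLauncher jobAppJobLauncher,
                              StepBuilderFactory stepBuilderFactory,
                              KafkaTemplate<Long, CellVO> kafkaTemplate,
                              KafkaItemWriteListener kafkaItemWriteListener,
                              DataSource cellDataSource) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.jobAppJobLauncher = jobAppJobLauncher;
        this.stepBuilderFactory = stepBuilderFactory;
        this.kafkaTemplate = kafkaTemplate;
        this.kafkaItemWriteListener = kafkaItemWriteListener;
        this.cellDataSource = cellDataSource;
    }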

Unfortunately, the test fails with a message that the application context could not be loaded.

The error is:

    Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'jobLauncherTestUtils':
    Unsatisfied dependency expressed through method 'setJob' parameter 0; nested exception is
    org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'cellBatchJobConfig':
    Unsatisfied dependency expressed through constructor parameter 0; nested exception is
    org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type
    'org.springframework.batch.core.configuration.annotation.JobBuilderFactory' available:
    expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}

One thing I tried was wiring up the job manually, but that did not work either. I don't even understand why it cannot find the job in the test when it can find it in the actual configuration:

    @Configuration
    class JobLaunchUtilsCellLine {

        @Autowired
        @Qualifier("cellLineJob")
        Job cellLineJob;

        @Bean
        public JobLauncherTestUtils cellLineJobLauncherUtils() {
            JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
            jobLauncherTestUtils.setJob(cellLineJob);
            return jobLauncherTestUtils;
        }

    }

Then I inject it into the Spring Batch test like this, but it doesn't work:

    @Qualifier("cellLineJobLauncherUtils")
    @Autowired
    JobLauncherTestUtils cellLineJobLauncherUtils;

However, it still complains that the JobBuilderFactory bean does not exist.

1 Answer

祁承嗣
2023-03-14

We ran into the same problem when adding a new scheduled job configuration.

How to fix it:

  1. Create a JobLauncherTestUtils configuration (similar to yours):
import org.springframework.batch.test.JobLauncherTestUtils
import org.springframework.batch.test.JobRepositoryTestUtils
import org.springframework.context.annotation.Bean

class JobSpecConfiguration {
    @Bean
    JobLauncherTestUtils getJobLauncherTestUtils() {
        new JobLauncherTestUtils()
    }

    @Bean
    JobRepositoryTestUtils getJobRepositoryTestUtils() {
        new JobRepositoryTestUtils()
    }
}

  2. In the test spec, extend that configuration and override setJob with the @Qualifier of the job under test:
import org.spockframework.spring.SpringBean
import org.springframework.batch.core.Job
import org.springframework.batch.test.JobLauncherTestUtils
import org.springframework.batch.test.JobRepositoryTestUtils
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.beans.factory.annotation.Qualifier
import org.springframework.boot.test.context.TestConfiguration
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Primary
import org.springframework.test.annotation.DirtiesContext
import org.springframework.test.context.ActiveProfiles
import org.springframework.test.context.ContextConfiguration
import spock.lang.Specification

@DirtiesContext
@ContextConfiguration(classes = [Application, TestConfig])
@ActiveProfiles(['test', 'kafka'])
class SubmitFilesJobSpec extends Specification {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils

    @Autowired
    private JobRepositoryTestUtils jobRepositoryTestUtils

    @SpringBean
    private SomeRepo someRepo = Mock()

    def cleanup() {
        jobRepositoryTestUtils.removeJobExecutions()
    }

    //some unit test that we have
    def "Verify batch run"() {
        given: "At least 1 Open Record"
        def record = defaultData()
        someRepo.findTop1ByStatus(_) >> record

        when: "A batch job has been triggered"
        def jobExecution = jobLauncherTestUtils.launchJob(BaseJobExecution.getJobParameters(null))

        then: "Job will be completed with at least 1 persisted/processed record"
        2 == jobExecution.getStepExecutions().size()
        jobExecution.getStepExecutions().forEach(stepExecution -> {
            1 == stepExecution.getWriteCount()
        })
        "SOME_JOB_NAME" == jobExecution.getJobInstance().getJobName()
        "COMPLETED" == jobExecution.getExitStatus().getExitCode()
    }

    @TestConfiguration
    static class TestConfig extends JobSpecConfiguration {

        @Override
        @Bean
        JobLauncherTestUtils getJobLauncherTestUtils() {
            new JobLauncherTestUtils() {
                @Override
                @Autowired
                void setJob(@Qualifier("submitFilesJob") Job job) {
                    super.setJob(job)
                }
            }
        }
    }
}
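
If you are not using Spock, the same idea in plain Java/JUnit would look roughly like the sketch below. It is only an illustration: it assumes your job bean keeps the name cellLineJob from your configuration, the class name CellLineTestConfig is made up, and you would pull it into the test with @Import(CellLineTestConfig.class) (or declare it as a static nested @TestConfiguration class of the test):

import org.springframework.batch.core.Job;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.batch.test.JobRepositoryTestUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;

// Sketch: test-only configuration that defines JobLauncherTestUtils itself and
// pins setJob to the "cellLineJob" bean, so the context does not need a single
// unqualified Job to satisfy setJob.
@TestConfiguration
public class CellLineTestConfig {

    @Bean
    public JobLauncherTestUtils jobLauncherTestUtils() {
        return new JobLauncherTestUtils() {
            @Override
            @Autowired
            public void setJob(@Qualifier("cellLineJob") Job job) {
                super.setJob(job);
            }
        };
    }

    @Bean
    public JobRepositoryTestUtils jobRepositoryTestUtils() {
        return new JobRepositoryTestUtils();
    }
}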
