Question:

I can't populate my H2 database with values from a CSV file

汪辰阳
2023-03-14

I am trying to populate an H2 database with values from a CSV file, as follows:

@Component
public class DBWriterOrder implements ItemWriter<OrderEntity> {

    private OrderRepository orderRepository;

    @Autowired
    public DBWriterOrder(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }

    @Override
    public void write(List<? extends OrderEntity> orders) throws Exception {
        System.out.println("Data Saved for Orders: " + orders);
        orderRepository.saveAll(orders);
    }
}
@Component
public class ProcessorOrder implements ItemProcessor<OrderEntity, OrderEntity> {

    public SimpleDateFormat sdf = new SimpleDateFormat("dd-MM-yyyy");

    @Override
    public OrderEntity process(OrderEntity orderEntity) throws Exception {

        Date deliveryDate = sdf.parse(orderEntity.getDeliveryDate().toString());
        long deliveryDateInMillis = deliveryDate.getTime();
        orderEntity.setDeliveryDate(deliveryDateInMillis);

        Date lastUpdated = sdf.parse(orderEntity.getLastUpdated().toString());
        long lastUpdatedInMillis = lastUpdated.getTime();
        orderEntity.setLastUpdated(lastUpdatedInMillis);

        return orderEntity;
    }
}
@Configuration
@EnableBatchProcessing
public class SpringBatchConfigOrder {

    @Bean
    public Job jobOrder(JobBuilderFactory jobBuilderFactory,
                        StepBuilderFactory stepBuilderFactory,
                        ItemReader<OrderEntity> itemReader,
                        ItemProcessor<OrderEntity, OrderEntity> itemProcessor,
                        ItemWriter<OrderEntity> itemWriter
    ) {

        Step step = stepBuilderFactory.get("ETL-file-load")
                .<OrderEntity, OrderEntity>chunk(100)
                .reader(itemReader)
                .processor(itemProcessor)
                .writer(itemWriter)
                .build();

        return jobBuilderFactory.get("ETL-Load")
                .incrementer(new RunIdIncrementer())
                .start(step)
                .build();
    }

    @Bean
    public FlatFileItemReader<OrderEntity> itemReaderOrder() {

        FlatFileItemReader<OrderEntity> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(new FileSystemResource("src/main/resources/orders.csv"));
        flatFileItemReader.setName("CSV-Reader");
        flatFileItemReader.setLinesToSkip(1);
        flatFileItemReader.setLineMapper(lineMapperOrder());
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<OrderEntity> lineMapperOrder() {

        DefaultLineMapper<OrderEntity> defaultLineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();

        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames("id", "destination", "deliveryDate", "statusOrder", "lastUpdated");

        BeanWrapperFieldSetMapper<OrderEntity> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(OrderEntity.class);

        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(fieldSetMapper);

        return defaultLineMapper;
    }

}
@RestController
@RequestMapping("/loadOrder")
public class OrderLoadController {

    @Autowired
    JobLauncher jobLauncherOrder;

    @Autowired
    Job jobOrder;

    @GetMapping
    public BatchStatus load() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {

        Map<String, JobParameter> maps = new HashMap<>();
        maps.put("time", new JobParameter(System.currentTimeMillis()));
        JobParameters parameters = new JobParameters(maps);
        JobExecution jobExecution = jobLauncherOrder.run(jobOrder, parameters);

        System.out.println("JobExecution: " + jobExecution.getStatus());

        System.out.println("Batch is Running...");
        while (jobExecution.isRunning()) {
            System.out.println("...");
        }

        return jobExecution.getStatus();
    }
}

This is also my entity class:

@Entity(name = "orders")
@Data
public class OrderEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @ManyToOne(fetch = FetchType.EAGER)
    private DestinationEntity destination;

    private Long deliveryDate;

    @Enumerated(value = EnumType.STRING)
    private OrderStatus statusOrder;

    private Long lastUpdated;

}

And this is my CSV file:

id,destination,deliveryDate,statusOrder,lastUpdated
1,Ploiesti,15-12-2021,NEW,15-12-2021
2,Ploiesti,15-12-2021,NEW,15-12-2021
3,Pitesti,15-12-2021,NEW,15-12-2021
4,Pitesti,15-12-2021,NEW,15-12-2021
5,Pitesti,15-12-2021,NEW,15-12-2021

When I call the endpoint localhost:8082/loadController, my database is not populated and stays empty, and all I get is the following error in the console:

 org.springframework.batch.item.file.FlatFileParseException: Parsing error at line: 2 in resource=[file [C:\Users\ALEX\Desktop\FinalProject\demo\src\main\resources\orders.csv]], input=[1,Ploiesti,15-12-2021,NEW,15-12-2021]
    at org.springframework.batch.item.file.FlatFileItemReader.doRead(FlatFileItemReader.java:189) ~[spring-batch-infrastructure-4.3.4.jar:4.3.4]

 Caused by: org.springframework.validation.BindException: 
org.springframework.validation.BeanPropertyBindingResult: 3 errors
 Field error in object 'target' on field 'lastUpdated': rejected value [15-12-2021]; codes [typeMismatch.target.lastUpdated,typeMismatch.lastUpdated,typeMismatch.java.lang.Long,typeMismatch]; arguments [org.springframework.context.support.DefaultMessageSourceResolvable: codes [target.lastUpdated,lastUpdated]; arguments []; default message [lastUpdated]]; default message [Failed to convert property value of type 'java.lang.String' to required type 'java.lang.Long' for property 'lastUpdated'; nested exception is java.lang.NumberFormatException: For input string: "15-12-2021"]
 Field error in object 'target' on field 'destination': rejected value [Ploiesti]; codes [typeMismatch.target.destination,typeMismatch.destination,typeMismatch.com.example.demo.destination.DestinationEntity,typeMismatch]; arguments [org.springframework.context.support.DefaultMessageSourceResolvable: codes [target.destination,destination]; arguments []; default message [destination]]; default message [Failed to convert property value of type 'java.lang.String' to required type 'com.example.demo.destination.DestinationEntity' for property 'destination'; nested exception is java.lang.IllegalStateException: Cannot convert value of type 'java.lang.String' to required type 'com.example.demo.destination.DestinationEntity' for property 'destination': no matching editors or conversion strategy found]
 Field error in object 'target' on field 'deliveryDate': rejected value [15-12-2021]; codes [typeMismatch.target.deliveryDate,typeMismatch.deliveryDate,typeMismatch.java.lang.Long,typeMismatch]; arguments [org.springframework.context.support.DefaultMessageSourceResolvable: codes [target.deliveryDate,deliveryDate]; arguments []; default message [deliveryDate]]; default message [Failed to convert property value of type 'java.lang.String' to required type 'java.lang.Long' for property 'deliveryDate'; nested exception is java.lang.NumberFormatException: For input string: "15-12-2021"]

Finally, my question is: what should I do, and how should I do it, to get this working?

2 answers in total

蒋烨然
2023-03-14

It looks like you need a custom FieldSetMapper, because you not only need to convert the string to a Date (and then to a long), but you also need to look up the destination by name. Below is an example that converts a string to a Date:

public class PersonFieldSetMapper implements FieldSetMapper<Person> {

  @Override
  public Person mapFieldSet(FieldSet fieldSet) throws BindException {
    return new Person(fieldSet.readLong("id"),
            fieldSet.readString("firstName"),
            fieldSet.readString("lastName"),
            fieldSet.readDate("birthdate", "yyyy-MM-dd HH:mm:ss"));
  }
}

Example here: https://www.dineshonjava.com/spring-batch-read-from-csv-and-write-to-relational-db/

You would also need to add some kind of Map/lookup for that as well, e.g. to resolve the destination by its name.
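
Below is a minimal sketch of what such a custom mapper could look like for OrderEntity. It is only an illustration: the class name OrderFieldSetMapper and the DestinationRepository.findByName lookup are assumptions, not code from the question; the dd-MM-yyyy pattern comes from the sample CSV.

import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.validation.BindException;

// Hypothetical custom mapper: parses the dd-MM-yyyy strings to epoch millis
// and resolves the destination by name. DestinationRepository.findByName is
// an assumed method, not shown in the question.
public class OrderFieldSetMapper implements FieldSetMapper<OrderEntity> {

    private final DestinationRepository destinationRepository;

    public OrderFieldSetMapper(DestinationRepository destinationRepository) {
        this.destinationRepository = destinationRepository;
    }

    @Override
    public OrderEntity mapFieldSet(FieldSet fieldSet) throws BindException {
        OrderEntity order = new OrderEntity();
        order.setId(fieldSet.readLong("id"));
        // readDate(name, pattern) parses the CSV value with the given date format
        order.setDeliveryDate(fieldSet.readDate("deliveryDate", "dd-MM-yyyy").getTime());
        order.setStatusOrder(OrderStatus.valueOf(fieldSet.readString("statusOrder")));
        order.setLastUpdated(fieldSet.readDate("lastUpdated", "dd-MM-yyyy").getTime());
        // assumed repository lookup: resolve the destination entity by its name
        order.setDestination(destinationRepository.findByName(fieldSet.readString("destination")));
        return order;
    }
}

Such a mapper would then be set on the DefaultLineMapper via setFieldSetMapper(...) in place of the BeanWrapperFieldSetMapper.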

孟鸿朗
2023-03-14

You can use a FieldSetMapper like the one below and swap it in for the mapper in your code. Use index-based reads for each field; for example, id will be at index 0.


public class OrderEntityMapper implements FieldSetMapper<OrderEntity> {

    @Override
    public OrderEntity mapFieldSet(FieldSet fieldSet) throws BindException {
        OrderEntity order = new OrderEntity();
        order.setId(fieldSet.readLong(0));
        // other fields
        return order;
    }
}

Then replace the BeanWrapperFieldSetMapper:


@Bean
public LineMapper<OrderEntity> lineMapperOrder() {

    DefaultLineMapper<OrderEntity> defaultLineMapper = new DefaultLineMapper<>();
    DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();

    lineTokenizer.setDelimiter(",");
    lineTokenizer.setStrict(false);
    lineTokenizer.setNames("id", "destination", "deliveryDate", "statusOrder", "lastUpdated");

    defaultLineMapper.setLineTokenizer(lineTokenizer);
    defaultLineMapper.setFieldSetMapper(new OrderEntityMapper());

    return defaultLineMapper;
}
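
If you stay with index-based reads, the "// other fields" part of OrderEntityMapper above might be filled in roughly like this. This is only a sketch: the column positions and the dd-MM-yyyy pattern follow the sample CSV, and mapping the destination column to a DestinationEntity is left out because it needs its own lookup, as the first answer notes.

// Sketch of mapFieldSet with index-based reads in CSV column order
// (0=id, 1=destination, 2=deliveryDate, 3=statusOrder, 4=lastUpdated).
@Override
public OrderEntity mapFieldSet(FieldSet fieldSet) throws BindException {
    OrderEntity order = new OrderEntity();
    order.setId(fieldSet.readLong(0));
    // readDate(index, pattern) parses the date string; getTime() gives epoch millis
    order.setDeliveryDate(fieldSet.readDate(2, "dd-MM-yyyy").getTime());
    order.setStatusOrder(OrderStatus.valueOf(fieldSet.readString(3)));
    order.setLastUpdated(fieldSet.readDate(4, "dd-MM-yyyy").getTime());
    return order;
}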
