Question:

Cannot connect to Oracle server from Google Cloud Dataflow

阎京
2023-03-14

I am trying to read data from an Oracle server that runs in the "CST" time zone. My Google Dataflow job runs in the "us-central1" region. I am using the JdbcIO.read() method of Apache Beam 2.3.0 to read data from the Oracle server. With the DirectRunner I can connect to the server and read data, but with the ojdbc8 driver jar on the DataflowRunner I get the following error:

(901b8e8f2f8a547a): java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (ORA-00604: error occurred at recursive SQL level 1
ORA-01882: timezone region not found
)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:338)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:308)
    at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
    at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
    at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:154)
    at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
    at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (ORA-00604: error occurred at recursive SQL level 1
ORA-01882: timezone region not found
)
    at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
    at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeSetup(Unknown Source)
    at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
    at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
    at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
    at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:415)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:326)
    ... 14 more
Caused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (ORA-00604: error occurred at recursive SQL level 1
ORA-01882: timezone region not found
)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2294)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
    at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
    at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.setup(JdbcIO.java:503)
Caused by: java.sql.SQLException: ORA-00604: error occurred at recursive SQL level 1
ORA-01882: timezone region not found

    at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:494)
    at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:441)
    at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:436)
    at oracle.jdbc.driver.T4CTTIfun.processError(T4CTTIfun.java:1061)
    at oracle.jdbc.driver.T4CTTIoauthenticate.processError(T4CTTIoauthenticate.java:550)
    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:623)
    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:252)
    at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:499)
    at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:1279)
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:663)
    at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:688)
    at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:39)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:691)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2304)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2290)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
    at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
    at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.setup(JdbcIO.java:503)
    at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeSetup(Unknown Source)
    at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
    at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
    at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
    at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:415)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:326)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:308)
    at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
    at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
    at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
    at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:154)
    at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
    at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Note: with the ojdbc6 driver jar I can successfully connect and read data from the server on both the DirectRunner and the DataflowRunner, but I would like to make this work with the ojdbc8 driver jar.

Below is my JdbcIO data source configuration:

DataSourceConfiguration dataSourceConfiguration = JdbcIO.DataSourceConfiguration
    .create(options.getDriverName(), options.getJdbcUrl())
    .withUsername(options.getUsername())
    .withPassword(options.getPassword())
    .withConnectionProperties("timezone=CST");
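
For context, a minimal sketch of how a configuration like this is typically plugged into a JdbcIO.read() transform in Beam 2.3.0 (the pipeline variable, query, and row mapper below are illustrative placeholders, not taken from the original job):

import java.sql.ResultSet;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.PCollection;

// "pipeline" is the Pipeline object; the query and column name are placeholders.
PCollection<String> rows = pipeline.apply(
    JdbcIO.<String>read()
        .withDataSourceConfiguration(dataSourceConfiguration)
        .withQuery("SELECT some_column FROM some_table")
        .withCoder(StringUtf8Coder.of())
        .withRowMapper(new JdbcIO.RowMapper<String>() {
          @Override
          public String mapRow(ResultSet resultSet) throws Exception {
            return resultSet.getString(1);
          }
        }));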

Is there any input on how to make this connection succeed with the ojdbc8 driver jar on the DataflowRunner?

1 answer

丌官皓君
2023-03-14

It looks like the error (ORA-00604: error occurred at recursive SQL level 1; ORA-01882: timezone region not found) is a known issue and fairly common with the Oracle JDBC driver.

It happens when, for some reason, the JDBC driver cannot send the correct time zone ID to the server. After some investigation and searching through different sources (including other Stack Overflow cases, such as this one or this other one), I found several possible solutions, so let me summarize them here:

  • Make sure you are using the latest available version of the ojdbc8 driver, since the specific version you are using may have a bug. Try switching to a different version and see whether that helps.
  • Try setting the default time zone to your time zone before the connection is established, as shown in Solution 2 below.
  • Add the configuration line from Solution 3 to the file oracle/jdbc/defaultConnectionProperties.properties.

Summarized solutions:

// Solution 2: set the JVM default time zone before the connection is established
// (requires java.util.TimeZone; replace "yourTimeZone" with a valid time zone ID)
TimeZone timeZone = TimeZone.getTimeZone("yourTimeZone");
TimeZone.setDefault(timeZone);

// Solution 3: line to add to oracle/jdbc/defaultConnectionProperties.properties
oracle.jdbc.timezoneAsRegion=false
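
On the DataflowRunner the connection is opened in JdbcIO$ReadFn.setup on the worker JVMs (as the stack trace above shows), so a TimeZone.setDefault call made in the launcher program may not reach them. An alternative sketch is to pass the Solution 3 property through the existing withConnectionProperties call, under the assumption that the ojdbc8 driver also honors oracle.jdbc.timezoneAsRegion as a per-connection property:

// Sketch: pass the Solution 3 property per connection instead of (or in
// addition to) editing oracle/jdbc/defaultConnectionProperties.properties.
// Assumption: the ojdbc8 driver accepts oracle.jdbc.timezoneAsRegion as a
// connection property; it replaces the original "timezone=CST" value.
DataSourceConfiguration dataSourceConfiguration = JdbcIO.DataSourceConfiguration
    .create(options.getDriverName(), options.getJdbcUrl())
    .withUsername(options.getUsername())
    .withPassword(options.getPassword())
    .withConnectionProperties("oracle.jdbc.timezoneAsRegion=false");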