A Flink Scala REPL script fails to compile when run from Java.
I tried the following Java code to run the Flink Scala REPL for testing, but the script always fails to compile.
Settings settings = new Settings();
((MutableSettings.BooleanSetting) settings.usejavacp()).value_$eq(true);
IMain main = new IMain(settings, new PrintWriter(System.out));
// Thread.currentThread().setContextClassLoader(main.classLoader());

for (String imp : imports) {
    main.interpret(MessageFormat.format("import {0}", imp));
}

ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment();
String script = FileUtils.readFileToString(
        new File("/opt/project/security-detection/sappo/src/sappo-interpreter/src/test/resources/demo.txt"),
        StandardCharsets.UTF_8);
main.bind(new NamedParamClass("env", ExecutionEnvironment.class.getName(), env));
main.interpret(script);
The Scala script (demo.txt):
val text = env.fromElements(
  "Who's there?",
  "I think I hear them. Stand, ho! Who's there?")

// result 1
val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } }
  .map { (_, 1) }
  .groupBy(0)
  .sum(1)
counts.print()

// result 2
val counts = text.map((x: String) => 1)
counts.print()

// result 3
text.print()
Result 1:
import org.apache.flink.core.fs._
import org.apache.flink.core.fs.local._
import org.apache.flink.api.common.io._
import org.apache.flink.api.common.aggregators._
import org.apache.flink.api.common.accumulators._
import org.apache.flink.api.common.distributions._
import org.apache.flink.api.common.operators._
import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint
import org.apache.flink.api.common.functions._
import org.apache.flink.api.java.io._
import org.apache.flink.api.java.aggregation._
import org.apache.flink.api.java.functions._
import org.apache.flink.api.java.operators._
import org.apache.flink.api.java.sampling._
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.utils._
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time._
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
console:67: error: missing parameter type for expanded function ((x$1) => x$1.toLowerCase.split("\\W+").filter(((x$2) => x$2.nonEmpty)))
val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)
Result 2:
(import echoes identical to result 1)
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : 5cbf8e476ebf32fd8fdf91766bd40af0
console:71: error: type mismatch;
 found   : String => Int
 required: org.apache.flink.api.common.functions.MapFunction[String,?]
val counts = text.map((x:String) => 1)
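The "result 2" error says the Java API's `map` wants a `MapFunction`, not a Scala function. One workaround against the Java API is to implement that SAM interface explicitly. This is an untested sketch (it assumes Flink 1.7 jars on the classpath and that `text` is the Java-API `DataSource[String]` shown in the output):

```scala
// Workaround sketch for the Java API (untested): pass an explicit
// MapFunction implementation instead of a Scala lambda.
import org.apache.flink.api.common.functions.MapFunction

val counts = text.map(new MapFunction[String, Integer] {
  override def map(x: String): Integer = 1
})
counts.print()
```

This sidesteps the compile error but keeps you on the Java API; binding the Scala-API environment (see below the accepted answer for context) is the more natural fix.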
Result 3:
(import echoes identical to result 1)
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
Who's there?
I think I hear them. Stand, ho! Who's there?
text: org.apache.flink.api.java.operators.DataSource[String] = org.apache.flink.api.java.operators.DataSource@53e28097
PASSED: testIMain
PASSED: testIMainScript
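The REPL echo already hints at the cause: `env` is bound as `org.apache.flink.api.java.ExecutionEnvironment`, so `text` is a Java-API `DataSource`, whose `map` takes the `MapFunction` SAM interface rather than a Scala function. Here is a minimal, Flink-free illustration of that mismatch; the `MapFunction` interface below is a simplified stand-in of my own, not Flink's real one:

```java
import java.util.ArrayList;
import java.util.List;

public class MapFunctionDemo {
    // Simplified stand-in for Flink's MapFunction SAM interface (an assumption,
    // not the real Flink type).
    interface MapFunction<T, O> {
        O map(T value);
    }

    // Stand-in for the Java-API DataSet.map: it accepts only a MapFunction,
    // which is why a Scala (String => Int) is rejected in "result 2".
    static <T, O> List<O> map(List<T> data, MapFunction<T, O> fn) {
        List<O> out = new ArrayList<>();
        for (T t : data) {
            out.add(fn.map(t));
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> text = List.of("Who's there?", "Stand, ho!");
        // javac converts a Java lambda to the SAM interface; a Scala FunctionN
        // gets no such conversion on the Java API, hence the type mismatch.
        List<Integer> ones = map(text, x -> 1);
        System.out.println(ones); // prints [1, 1]
    }
}
```

The Scala API avoids this by wrapping Scala functions into `MapFunction` itself, which is why the same lambdas compile in Flink's own Scala shell.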
Try the Scala REPL that ships with Flink:
$ bin/start-scala-shell.sh local
I tried the three examples you shared (with Flink 1.7.0), and they all worked fine.
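If you need the embedded IMain approach to work, a possible fix (my assumption, untested) is to bind the Scala-API environment instead of the Java one, which is effectively what Flink's own Scala shell does. In Flink 1.7 the Scala `ExecutionEnvironment` can wrap a Java environment via its constructor, but verify that against your version:

```java
// Hedged sketch (untested, assumes Flink 1.7 on the classpath): wrap the Java
// environment in the Scala-API ExecutionEnvironment and bind that, so the
// script's Scala lambdas and implicit TypeInformation are picked up.
org.apache.flink.api.scala.ExecutionEnvironment scalaEnv =
        new org.apache.flink.api.scala.ExecutionEnvironment(env);
main.bind(new NamedParamClass("env",
        "org.apache.flink.api.scala.ExecutionEnvironment", scalaEnv));
```

With the Scala environment bound, `env.fromElements(...)` yields a Scala `DataSet[String]`, and all three example snippets should compile as they do in the shell.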