I followed the instructions of the Google ML Kit codelab (com.google.codelab.mlkit).
Instead of using the intent data, I am using a FileProvider to obtain and analyze the full-size image.
I replaced the virtual scene image of the Android emulator camera with a custom image, as recommended in "Android emulator camera custom image".
So I start my app (MainActivity), take a picture with the built-in camera (which shows the replaced virtual image) and return to MainActivity. I do reach recognizeTextFromImage(), but I never reach .addOnSuccessListener() or .addOnFailureListener(). That surprises me, because I do not even get a failure; nothing is printed to the log.
I use API level 23, because with a higher API level the result code is 0 (instead of -1).
My question is: why does my code never reach .addOnSuccessListener(), or at least .addOnFailureListener()?
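For reference, Activity.RESULT_OK is -1 and Activity.RESULT_CANCELED is 0, so a result code of 0 means the camera activity reported a cancellation. A minimal logging sketch, assuming the MainActivity shown further below with its TAG constant and REQUEST_IMAGE_CAPTURE = 1, to confirm what the camera intent actually returns:

// Debugging sketch; assumes MainActivity's TAG and REQUEST_IMAGE_CAPTURE as in the code below.
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_IMAGE_CAPTURE) {
        // RESULT_OK == -1, RESULT_CANCELED == 0
        Log.d(TAG, "Camera returned resultCode=" + resultCode
                + " (RESULT_OK=" + RESULT_OK + ", RESULT_CANCELED=" + RESULT_CANCELED + ")");
    }
}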
Update 1
I tried intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION), and this changed the behavior: reading the file now takes a few seconds. So I think @Danish is right that the file was not being created by the camera before. But .addOnFailureListener() still shows the same problem. Maybe the file is too large? Or am I sending the wrong format? The log says: W/e.codelab.mlki: java.lang.String com.google.codelab.mlkit.MainActivity.recognizeTextFromImage(com.google.mlkit.vision.common.InputImage) took 536.238ms
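One way to narrow down the "too large / wrong format" question is to inspect the captured file before decoding it; a minimal sketch, assuming the currentPhotoPath field from the code below:

// Hypothetical check, assuming currentPhotoPath points at the file passed via EXTRA_OUTPUT.
File captured = new File(currentPhotoPath);
Log.d(TAG, "exists=" + captured.exists() + ", size=" + captured.length() + " bytes");

// Decode only the header to see whether the camera wrote a readable image and how large it is.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inJustDecodeBounds = true;
BitmapFactory.decodeFile(currentPhotoPath, opts);
Log.d(TAG, "mime=" + opts.outMimeType + ", " + opts.outWidth + "x" + opts.outHeight);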
Update 2
<?xml version="1.0" encoding="utf-8"?>
<paths xmlns:android="http://schemas.android.com/apk/res/android">
<external-files-path name="my_images" path="Pictures" />
</paths>
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.google.codelab.mlkit">
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature
android:name="android.hardware.camera"
android:required="true"/>
<queries>
<intent>
<action android:name="android.media.action.IMAGE_CAPTURE" />
</intent>
</queries>
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity android:name=".MainActivity"
android:screenOrientation="portrait"
android:configChanges="keyboardHidden|orientation|screenSize">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<meta-data
android:name="com.google.codelab.mlkit.vision.DEPENDENCIES"
android:value="ocr" />
<provider
android:name="androidx.core.content.FileProvider"
android:authorities="${applicationId}.fileprovider"
android:exported="false"
android:grantUriPermissions="true">
<meta-data
android:name="android.support.FILE_PROVIDER_PATHS"
android:resource="@xml/file_paths">
</meta-data>
</provider>
</application>
</manifest>
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:layoutDescription="@xml/activity_main_scene"
tools:context=".MainActivity" >
<TableLayout
android:layout_width="360dp"
android:layout_height="539dp"
android:layout_centerInParent="true">
<TableRow
android:layout_width="match_parent"
android:layout_height="match_parent">
<TableLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<TableRow
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:id="@+id/textView_ServiceID"
android:layout_width="11dp"
android:layout_height="match_parent"
android:layout_weight="0.3"
android:gravity="center"
android:text="ServiceID"
android:textAlignment="viewStart" />
<EditText
android:id="@+id/editText_ServiceID"
android:layout_height="match_parent"
android:layout_weight="0.7"
android:gravity="left"
android:inputType="text"
android:text="4711" />
</TableRow>
</TableLayout>
</TableRow>
<TableRow
android:layout_width="match_parent"
android:layout_height="match_parent">
<TableLayout
android:layout_width="359dp"
android:layout_height="match_parent">
<TableRow
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:id="@+id/textView_Site"
android:layout_height="match_parent"
android:layout_weight="0.3"
android:gravity="center"
android:text="Site"
android:textAlignment="viewStart" />
<Spinner
android:id="@+id/spinner_Site"
android:layout_width="wrap_content"
android:layout_height="match_parent" />
<ImageButton
android:id="@+id/imageButton_Site"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:srcCompat="@drawable/ic_photo" />
<EditText
android:id="@+id/editText_Site"
android:layout_height="match_parent"
android:layout_weight="0.6"
android:inputType="text"
android:text="not yet reconized" />
</TableRow>
</TableLayout>
</TableRow>
<TableRow
android:layout_width="match_parent"
android:layout_height="match_parent">
<TableLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="end" >
<TableRow
android:layout_gravity="end"
android:layout_width="match_parent"
android:layout_height="match_parent" >
<ImageButton
android:id="@+id/imageButton6"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:srcCompat="@drawable/ic_send" />
<ImageButton
android:id="@+id/imageButton7"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:srcCompat="@drawable/ic_save" />
</TableRow>
</TableLayout>
</TableRow>
</TableLayout>
</RelativeLayout>
package com.google.codelab.mlkit;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.ImageButton;
import android.widget.Spinner;
import android.widget.Toast;
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.FileProvider;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.android.gms.tasks.Task;
import com.google.mlkit.vision.common.InputImage;
import com.google.mlkit.vision.text.Text;
import com.google.mlkit.vision.text.TextRecognition;
import com.google.mlkit.vision.text.TextRecognizer;
import com.google.mlkit.vision.text.latin.TextRecognizerOptions;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.Locale;
public class MainActivity extends AppCompatActivity implements AdapterView.OnItemSelectedListener {
private static final String TAG = "MainActivity";
private ImageButton imageButton_Site;
private EditText editText_Site;
private Spinner spinner_Site;
private InputImage createdImage;
String recognizedText = "";
String currentPhotoPath;
static final int REQUEST_IMAGE_CAPTURE = 1;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
imageButton_Site = findViewById(R.id.imageButton_Site);
imageButton_Site.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
dispatchTakePictureIntent();
}
});
spinner_Site = findViewById(R.id.spinner_Site);
String[] items = new String[]{"Chicago", "New York"}; // has to be retrieved from server based on GPS (satellite)
ArrayAdapter<String> adapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item, items);
spinner_Site.setAdapter(adapter);
spinner_Site.setOnItemSelectedListener(this);
}
private void dispatchTakePictureIntent() {
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
// Ensure that there's a camera activity to handle the intent
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
// Create the File where the photo should go
File photoFile = null;
try {
photoFile = createImageFile();
} catch (IOException ex) {
Toast.makeText(getApplicationContext(), "Error occurred while creating the File!",Toast.LENGTH_LONG).show();
}
// Continue only if the File was successfully created
if (photoFile != null) {
Uri photoURI = FileProvider.getUriForFile(this,
BuildConfig.APPLICATION_ID + ".fileprovider",
photoFile);
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
takePictureIntent.putExtra(Intent.EXTRA_RETURN_RESULT, true);
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, photoURI);
takePictureIntent.setFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION); // new
startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
}
}
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult( requestCode, resultCode, data);
// Log.d("OLR", "onActivityResult: "+ requestCode +" "+resultCode+" "+data);
if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
Bitmap bitmap=BitmapFactory.decodeFile(currentPhotoPath);
Bitmap imagebitmap = Bitmap.createBitmap(bitmap); // Just add this line and everything will work fine. I tried this code, it works like a charm.
int check = bitmap.getWidth();
InputImage inputImage = InputImage.fromBitmap(imagebitmap, 0);
String text = recognizeTextFromImage(inputImage);
if (text.isEmpty())
{
Toast.makeText(getApplicationContext(), "Nothing recognized. Please try again!",Toast.LENGTH_LONG).show();
editText_Site.setText("Failed !");
}
else {
editText_Site.setText(text);
}
}
else {
Toast.makeText(getApplicationContext(), "An issue occurred. Please inform app owner!",Toast.LENGTH_LONG).show();
}
}
private File createImageFile() throws IOException {
File storageDir = getExternalFilesDir(Environment.DIRECTORY_PICTURES);
// create directory if necessary
if (!storageDir.exists()){
storageDir.mkdir();
}
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",Locale.GERMANY).format(new Date());
String imageFileName = "OLR_" + timeStamp + "_";
File image = File.createTempFile(
imageFileName, // prefix
".jpg", // suffix
storageDir // directory
);
// imageFile = image;
currentPhotoPath = image.getAbsolutePath();
return image;
}
private String recognizeTextFromImage(InputImage image) {
TextRecognizer recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS);
Task<Text> task = recognizer.process(image);
// recognizer.process(image)
task
.addOnSuccessListener(
new OnSuccessListener<Text>() {
@Override
public void onSuccess(Text texts) {
recognizedText = processTextRecognitionResult(texts);
Log.d(TAG,"Successful");
}
})
/*.addOnSuccessListener(
new OnSuccessListener<Text>() {
@Override
public void onSuccess(Text texts) {
recognizedText = texts.getText();
editText_Site.setText(recognizedText);
Log.d(TAG,"Successful");
}
})*/
.addOnFailureListener(
new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
e.printStackTrace();
Log.d(TAG,"Not Successful");
}
});
recognizer.close();
return recognizedText;
}
private String processTextRecognitionResult(Text texts) {
String recognizedText = "";
List<Text.TextBlock> blocks = texts.getTextBlocks();
if (blocks.size() == 0) {
// No text found
}
else {
for (int i = 0; i < blocks.size(); i++) {
List<Text.Line> lines = blocks.get(i).getLines();
for (int j = 0; j < lines.size(); j++) {
List<Text.Element> elements = lines.get(j).getElements();
for (int k = 0; k < elements.size(); k++) {
String elementText = elements.get(k).getText();
recognizedText = recognizedText + elementText;
}
}
}
}
return recognizedText;
}
private void showToast(String message) {
Toast.makeText(getApplicationContext(), message, Toast.LENGTH_SHORT).show();
}
public void onItemSelected(AdapterView<?> parent, View v, int position, long id) {
spinner_Site = findViewById(R.id.spinner_Site);
editText_Site = findViewById(R.id.editText_Site);
editText_Site.setText(spinner_Site.getSelectedItem().toString());
}
@Override
public void onNothingSelected(AdapterView<?> parent) {
// Do nothing
}
}
After working on this problem for a couple of hours, I finally came up with a solution. @Mickey, I found the errors in your code, so to help you I am providing the solution below. Basically, the defect is related to rotation: if the image is not captured in landscape orientation, the OCR cannot recognize the text correctly, so it is necessary to make sure the image is rotated correctly before scanning. There is also a problem with the line recognizedText = processTextRecognitionResult(texts); it does not assemble the text correctly. As you can see below, I added one more EditText to show the difference in how the text comes back; texts.getText() gives better results, since it returns the whole recognized text with its separators, while concatenating the individual elements drops the whitespace between words.
Check the code below:
import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Matrix;
import android.media.ExifInterface;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.os.ParcelFileDescriptor;
import android.os.StrictMode;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.ImageButton;
import android.widget.Spinner;
import android.widget.Toast;
import androidx.annotation.NonNull;
import androidx.annotation.RequiresApi;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.FileProvider;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.android.gms.tasks.Task;
import com.google.mlkit.vision.common.InputImage;
import com.google.mlkit.vision.text.Text;
import com.google.mlkit.vision.text.TextRecognition;
import com.google.mlkit.vision.text.TextRecognizer;
import com.google.mlkit.vision.text.latin.TextRecognizerOptions;
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.Locale;
public class MainActivity extends AppCompatActivity implements AdapterView.OnItemSelectedListener {
private static final String TAG = "MainActivity";
private ImageButton imageButton_Site;
private EditText editText_Site;
EditText editText;
private Spinner spinner_Site;
private InputImage createdImage;
String recognizedText="";
String currentPhotoPath;
static final int REQUEST_IMAGE_CAPTURE = 1;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
editText_Site=findViewById(R.id.editText_Site);
editText=findViewById(R.id.editText);
imageButton_Site = findViewById(R.id.imageButton_Site);
spinner_Site = findViewById(R.id.spinner_Site);
String[] items = new String[]{"Chicago", "Ludwigshafen"}; // has to be retrieved from server based on GPS (satellite)
ArrayAdapter<String> adapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item, items);
spinner_Site.setAdapter(adapter);
spinner_Site.setOnItemSelectedListener(this);
imageButton_Site.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
dispatchTakePictureIntent();
}
});
}
private void dispatchTakePictureIntent() {
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
// Ensure that there's a camera activity to handle the intent
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
// Create the File where the photo should go
File photoFile = null;
try {
photoFile = createImageFile();
} catch (IOException ex) {
Toast.makeText(getApplicationContext(), "Error occurred while creating the File!",Toast.LENGTH_LONG).show();
}
// Continue only if the File was successfully created
if (photoFile != null) {
Uri photoURI = FileProvider.getUriForFile(this,BuildConfig.APPLICATION_ID+".fileprovider",
photoFile);
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, photoURI);
startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
}
}
}
}
@RequiresApi(api = Build.VERSION_CODES.N)
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult( requestCode, resultCode, data);
// Log.d("OLR", "onActivityResult: "+ requestCode +" "+resultCode+" "+data);
if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
ParcelFileDescriptor parcelFileDescriptor = null;
try {
parcelFileDescriptor = getContentResolver().openFileDescriptor(Uri.fromFile(new File(currentPhotoPath)), "r");
} catch (FileNotFoundException e) {
e.printStackTrace();
}
FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
Bitmap bitmap = BitmapFactory.decodeFileDescriptor(fileDescriptor);
ExifInterface exifInterface = null;
try {
exifInterface = new ExifInterface(fileDescriptor);
} catch (IOException e) {
e.printStackTrace();
}
int orientation = exifInterface.getAttributeInt(ExifInterface.TAG_ORIENTATION, 1);
switch (orientation) {
case ExifInterface.ORIENTATION_ROTATE_90:
Matrix matrix = new Matrix();
matrix.setRotate(90);
bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
break;
case ExifInterface.ORIENTATION_ROTATE_180:
Matrix matrixe = new Matrix();
matrixe.setRotate(180);
bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrixe, true);
break;
case ExifInterface.ORIENTATION_ROTATE_270:
Matrix matrixes = new Matrix();
matrixes.setRotate(270);
bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrixes, true);
break;
case ExifInterface.ORIENTATION_NORMAL:
Matrix matrix12 = new Matrix();
matrix12.setRotate(ExifInterface.ORIENTATION_ROTATE_90);
bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix12, true);
}
InputImage inputImage=null;
inputImage= InputImage.fromBitmap(bitmap,0);
String text=null;
try {
text = recognizeTextFromImage(inputImage);
} catch (IOException e) {
e.printStackTrace();
}
if (text.isEmpty())
{
Toast.makeText(getApplicationContext(), "Nothing recognized. Please try again!",Toast.LENGTH_LONG).show();
editText_Site.setText("Failed !");
}
else {
editText_Site.setText(text);
}
}
else {
Toast.makeText(getApplicationContext(), "An issue occurred. Please inform app owner!",Toast.LENGTH_LONG).show();
}
}
private File createImageFile() throws IOException {
File storageDir = getExternalFilesDir(Environment.DIRECTORY_PICTURES);
// create directory if necessary
if (!storageDir.exists()){
storageDir.mkdir();
}
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.GERMANY).format(new Date());
String imageFileName = "OLR_" + timeStamp + "_";
File image = File.createTempFile(
imageFileName, // prefix
".jpg", // suffix
storageDir // directory
);
// imageFile = image;
currentPhotoPath = image.getAbsolutePath();
return image;
}
private String recognizeTextFromImage(InputImage image) throws IOException {
TextRecognizer recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS);
Task<Text> task = recognizer.process(image)
.addOnSuccessListener(
new OnSuccessListener<Text>() {
@Override
public void onSuccess(Text texts) {
recognizedText = processTextRecognitionResult(texts);
editText.setText(texts.getText());
Log.d(TAG,"Successful");
}
})
.addOnFailureListener(
new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
e.printStackTrace();
Log.d(TAG,"Not Successful");
}
});
return recognizedText;
}
private String processTextRecognitionResult(Text texts) {
String recognizedText = "";
List<Text.TextBlock> blocks = texts.getTextBlocks();
if (blocks.size() == 0) {
// No text found
}
else {
for (int i = 0; i < blocks.size(); i++) {
List<Text.Line> lines = blocks.get(i).getLines();
for (int j = 0; j < lines.size(); j++) {
List<Text.Element> elements = lines.get(j).getElements();
for (int k = 0; k < elements.size(); k++) {
String elementText = elements.get(k).getText();
recognizedText = recognizedText + elementText;
}
}
}
}
return recognizedText;
}
private void showToast(String message) {
Toast.makeText(getApplicationContext(), message, Toast.LENGTH_SHORT).show();
}
public void onItemSelected(AdapterView<?> parent, View v, int position, long id) {
spinner_Site = findViewById(R.id.spinner_Site);
editText_Site.setText(spinner_Site.getSelectedItem().toString());
}
@Override
public void onNothingSelected(AdapterView<?> parent) {
// Do nothing
}
}
Problem solved:
(1) I had to add intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION).
(2) I checked the returned string "recognizedText": it was always empty, because recognizeTextFromImage() returns before the success listener has filled the variable (the recognition runs asynchronously). Ashes on my head. Thanks to @Danish, who gave me the decisive hint to add logging in the right places. A sketch of the asynchronous handling follows below.
(3) For whatever reason, I had to wait a while before I could start debugging.
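A minimal sketch of how point (2) can be handled, assuming the MainActivity fields from the code above (editText_Site, TAG) and its processTextRecognitionResult() helper. The hypothetical method recognizeTextFromImageAsync() consumes the result inside the listeners instead of returning a String, and closes the recognizer only once the Task completes; it additionally needs import com.google.android.gms.tasks.OnCompleteListener:

// Sketch only: handle the ML Kit result asynchronously instead of returning a String.
// Assumes editText_Site, TAG and processTextRecognitionResult() from the code above;
// recognizeTextFromImageAsync is a hypothetical replacement for recognizeTextFromImage.
private void recognizeTextFromImageAsync(InputImage image) {
    final TextRecognizer recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS);
    recognizer.process(image)
            .addOnSuccessListener(new OnSuccessListener<Text>() {
                @Override
                public void onSuccess(Text texts) {
                    // The UI is updated here, after the task has actually finished.
                    String result = processTextRecognitionResult(texts);
                    editText_Site.setText(result.isEmpty() ? "Failed !" : result);
                    Log.d(TAG, "Successful");
                }
            })
            .addOnFailureListener(new OnFailureListener() {
                @Override
                public void onFailure(@NonNull Exception e) {
                    Log.e(TAG, "Not Successful", e);
                    editText_Site.setText("Failed !");
                }
            })
            .addOnCompleteListener(new OnCompleteListener<Text>() {
                @Override
                public void onComplete(@NonNull Task<Text> task) {
                    // Close the recognizer only after the task is done,
                    // not right after process() as in the original code.
                    recognizer.close();
                }
            });
}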