
I'm using image_picker to record video, but I want to limit the video duration

东郭自珍
2023-03-14
Question

I implemented video recording with image_picker, and we need to limit the video recording time.

pubspec.yaml dependency: image_picker: ^0.4.10

[flutter] flutter doctor
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel beta, v1.0.1-pre.2, on Mac OS X 10.14.2 18C54, locale zh-Hans-CN)
[✓] Android toolchain - develop for Android devices (Android SDK version 28.0.3)
[✓] iOS toolchain - develop for iOS devices (Xcode 10.1)
[✓] Android Studio (version 3.0)
[✓] IntelliJ IDEA Ultimate Edition (version 2018.1.7)
[✓] VS Code (version 1.31.1)
[✓] Connected device (2 available)

• No issues found!
exit code 0


// Code to open the camera for recording
ImagePicker.pickVideo(source: ImageSource.camera).then((File file) {
    if (file != null && mounted) {
        var tempFile = file;
    }
});

I want to set a recording time limit before opening the camera. How can I do this?


Answer:

@Coding24h:

Below is the Dart file that captures video and plays it back, with an explanation.

The self-written Dart files it imports:

(1) 'GlobalVariables.dart' - contains the class 'gv' with static variables,
accessible from all pages/widgets

(2) 'LangStrings.dart' - this app is multilingual; this file contains the
strings in different languages (English, Chinese, ...)

(3) 'ScreenVariables.dart' - contains all screen-related variables, e.g.
orientation, height, width, physical height, devicePixelRatio, ...

(4) 'Utilities.dart' - contains utility functions usable by all pages, e.g.
showing a toast message anywhere, at any time
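
These four helper files are not included in the answer. As a rough reading aid, here is a hypothetical sketch of what 'GlobalVariables.dart' might contain, reconstructed purely from the usages in the page below (every field is inferred; none is taken from the original file):

// Hypothetical reconstruction of 'GlobalVariables.dart', inferred
// from the usages in the page source below.
import 'package:camera/camera.dart';
import 'package:threading/threading.dart';
import 'package:video_player/video_player.dart';

class gv {
  // Camera list, presumably filled by availableCameras() at startup
  static List<CameraDescription> cameras;

  // Shared video player controller and the slider-update thread
  static VideoPlayerController ctlVideo;
  static Thread threadHomeVideo;

  // Guards for the doubled navigator.push init/dispose calls
  static bool bolHomeFirstInit = true;
  static bool bolHomeFirstDispose = true;

  // Video duration and slider position, both in milliseconds
  static int intHomeVDMS = 0;
  static double dblHomeVDSliderValueMS = 0.0;

  // Output paths for recorded movies and captured images
  static String strMoviePath;
  static String strImagePath;
  static String strImageFile;
}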

  1. initState() method: the camera controller is declared and initialized here.

  2. dispose() method: video recording is stopped manually when (1) the user clicks the 'stop' button, or (2) the user leaves this page. Recording also stops automatically when the user switches to another app or turns the phone screen off. In either case, the camera controller object should be disposed of in dispose().

  3. didChangeAppLifecycleState() method: pauses video playback if the app goes into the background.

  4. funTimerVideo() method: a timer that updates the slider position every second. (If there is a recording limit, another timer can be created to stop the recording; see the sketch after this list.)

  5. funSelectVideo() method: selects a video from the gallery.

  6. funCameraStart() method: takes a picture for later preview, then starts capturing video.

  7. funCameraStop() method: stops capturing video. It is called when (1) the user presses the 'stop recording' button, or (2) the 'another timer' mentioned in funTimerVideo() fires because the recording limit was exceeded.
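
The recording-limit timer mentioned in (4) and (7) is not part of the source below. One possible sketch of it, using dart:async rather than the threading package (the limit constant and function name are hypothetical):

import 'dart:async';

// Hypothetical recording limit in seconds.
const int intMaxRecordSeconds = 30;

// One-shot timer, started when recording starts: when it fires, it
// stops the camera through the page's funCameraStop(). Cancel the
// returned Timer inside funCameraStop() if the user stops early.
Timer funStartRecordLimitTimer(Future<void> Function() funCameraStop) {
  return Timer(Duration(seconds: intMaxRecordSeconds), () {
    funCameraStop();
  });
}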

Program source code (the video player and capture page only):

// Import Flutter Darts
import 'dart:io';
import 'package:camera/camera.dart';
import 'package:file_picker/file_picker.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
import 'package:font_awesome_flutter/font_awesome_flutter.dart';
import 'package:intl/intl.dart';
import "package:threading/threading.dart";
import 'package:video_player/video_player.dart';

// Import Self Darts
import 'GlobalVariables.dart';
import 'LangStrings.dart';
import 'ScreenVariables.dart';
import 'Utilities.dart';

// Import Pages
import 'BottomBar.dart';

// Home Page
class ClsHome extends StatefulWidget {
  @override
  _ClsHomeState createState() => _ClsHomeState();
}
class _ClsHomeState extends State<ClsHome> with WidgetsBindingObserver {
  AppLifecycleState _lastLifecycleState;

  // Declare Camera
  CameraController ctlCamera;

  // Var for Video
  bool bolVideoPaused = true;

  @override
  void initState() {
    super.initState();
    print("Init State Started");
    if (gv.bolHomeFirstInit) {
      // This page is called by navigator.push twice, do nothing on the first call
      gv.bolHomeFirstInit = false;
    } else {
      // Not the first time call of Init, do Init
      WidgetsBinding.instance.addObserver(this);

      try {
        // Try to dispose old Camera Control
        ctlCamera.dispose();
        print("Camera Disposed 1");
      } catch (err) {
        print("Camera Dispose Error: " + err.toString());
      }
      try {
        // Declare New Camera Control
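        // gv.cameras is presumably filled by availableCameras() at app
        // startup; index [1] is typically the front-facing camera.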
        ctlCamera = CameraController(gv.cameras[1], ResolutionPreset.high);
        ctlCamera.initialize().then((_) {
          if (!mounted) {
            ut.showToast('1:' + ls.gs('SystemErrorOpenAgain'), true);
            return;
          }
          setState(() {});
          print('Controller Inited');
        });
      } catch (err) {
        ut.showToast('2:' + ls.gs('SystemErrorOpenAgain'), true);
        print("Camera Init Error: " + err.toString());
      }
      try {
        gv.threadHomeVideo = new Thread(funTimerVideo);
        gv.threadHomeVideo.start();
        print('New Video Timer Started');
      } catch (err) {
        ut.showToast('3:' + ls.gs('SystemErrorOpenAgain'), true);
        print('New Video Timer Error: ' + err.toString());
      }
    }
    print("Init State Ended");
  }

  @override
  void dispose() async {
    super.dispose();
    print("Dispose Started");
    if (gv.bolHomeFirstDispose) {
      gv.bolHomeFirstDispose = false;
    } else {
      WidgetsBinding.instance.removeObserver(this);

      try {
        await funCameraStop();
        ctlCamera?.dispose();

        print("Camera Disposed");
      } catch (err) {
        //
        print("Play Video Dispose Error 1: " + err.toString());
      }
      try {
        // gv.ctlVideo?.dispose();
        gv.ctlVideo.pause();
        // gv.threadPageHomeVideo.abort();
        // print('Thread Video Aborted');
      } catch (err) {
        //
        print("Play Video Dispose Error 2: " + err.toString());
      }
      // print('Controller dispose');
    }
    print("Dispose Ended");
  }

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    _lastLifecycleState = state;
    print('*****   Life Cycle State: ' + _lastLifecycleState.toString() + '   *****');
    if (_lastLifecycleState == AppLifecycleState.paused) {
      try {
        if (gv.ctlVideo.value.isPlaying) {
          bolVideoPaused = true;
          gv.ctlVideo.pause();
        }
        setState(() {});
      } catch (err) {
        //
      }
    } else if (_lastLifecycleState == AppLifecycleState.resumed) {
    }
  }

  // Timer to setState Video Play Position
  void funTimerVideo() async {
    while (true) {
      await Thread.sleep(1000);
      try {
        if (gv.ctlVideo.value.isPlaying) {
          gv.dblHomeVDSliderValueMS =
              gv.ctlVideo.value.position.inMilliseconds.toDouble();
          setState(() {});
        }
      } catch (err) {
        // Video Not Yet Ready, Do Nothing
      }
    }
  }

  // Select Video from External Storage
  funSelectVideo() async {
    print('Select Video Started');
    String filePath = '';
    filePath = await FilePicker.getFilePath(type: FileType.VIDEO);
    if (filePath != null && filePath != '') {
      try {
        // Declare Video if a video file is selected
        gv.ctlVideo = VideoPlayerController.file(File(filePath))
          ..initialize().then((_) {
            // Set Video Looping
            gv.ctlVideo.setLooping(true);

            // Get Video Duration in Milliseconds
            gv.intHomeVDMS = gv.ctlVideo.value.duration.inMilliseconds;
            setState(() {});
            print('Video Inited');
          });
      } catch (err) {
        print('Video Init Error: ' + err.toString());
        gv.intHomeVDMS = 0;
        ut.showToast(ls.gs('VideoErrorUnsupport'), true);
      }
    } else {
      print('No Video Selected');
      setState(() {});
    }
    print('Select Video Ended');
  }

  // The Widget that show the Video Player
  Widget ctnVideoPlayer() {
    try {
      double dblHeight = sv.dblBodyHeight / 1.8;
      print('Before Check Video Init');
      if (gv.ctlVideo.value.initialized) {
        print('Before Check Video AspectRatio');
        if (gv.ctlVideo.value.aspectRatio < 1) {
          dblHeight = sv.dblBodyHeight / 1.25;
        }
        print('Before Return ctnVideoPlayer');
        return Column(
          mainAxisAlignment: MainAxisAlignment.center,
          crossAxisAlignment: CrossAxisAlignment.center,
          children: <Widget>[
            Container(
              padding: EdgeInsets.fromLTRB(0, 10, 0, 10),
              height: dblHeight,
              width: sv.dblScreenWidth,
              child: Center(
                child: AspectRatio(
                  aspectRatio: gv.ctlVideo.value.aspectRatio,
                  child: VideoPlayer(gv.ctlVideo),
                ),
              ),
            ),
            objVideoSlider(),
          ],
        );
      } else {
        print('Before Return EMPTY ctnVideoPlayer');
        return Container(
          // color: Colors.white,
          height: dblHeight,
          width: sv.dblScreenWidth,
          child: Center(
            child: Text(ls.gs('SelectVideo')),
          ),
        );
      }
    } catch (err) {
      print('Page Home ctnVideoPlayer() : ' + err.toString());
      return Container(
        // color: Colors.white,
        height: sv.dblBodyHeight / 1.8,
        width: sv.dblScreenWidth,
        child: Center(
          child: Text(ls.gs('SelectVideo')),
        ),
      );
    }
  }

  // function when Play or Pause clicked
  funPlayVideo() async {
    try {
      if (gv.ctlVideo.value.initialized) {
        if (gv.ctlVideo.value.isPlaying) {
          bolVideoPaused = true;
          gv.ctlVideo.pause();

          // Stop Camera Recording
          funCameraStop();
        } else {
          bolVideoPaused = false;
          gv.ctlVideo.play();

          // Start Camera Recording
          funCameraStart();
        }
        setState(() {});
      } else {
        // Do Nothing
      }
    } catch (err) {
      // Do Nothing
    }
  }

  // function when Forward 15 seconds clicked
  funForwardVideo() async {
    try {
      if (gv.ctlVideo.value.initialized) {
        gv.ctlVideo.seekTo(gv.ctlVideo.value.position + Duration(seconds: 15));
        setState(() {});
      } else {
        // Do Nothing
      }
    } catch (err) {
      // Do Nothing
    }
  }

  // function when Backward 15 seconds clicked
  funBackwardVideo() async {
    try {
      if (gv.ctlVideo.value.initialized) {
        gv.ctlVideo.seekTo(gv.ctlVideo.value.position - Duration(seconds: 15));
        setState(() {});
      } else {
        // Do Nothing
      }
    } catch (err) {
      // Do Nothing
    }
  }

  // Widget to show the Slider of the playing position of Video
  Widget objVideoSlider() {
    try {
      if (gv.ctlVideo.value.initialized) {
        return Row(
          mainAxisAlignment: MainAxisAlignment.center,
          crossAxisAlignment: CrossAxisAlignment.center,
          children: <Widget>[
            Text(' '),
            Text(gv.ctlVideo.value.position.inHours.toString() +
                ":" +
                (gv.ctlVideo.value.position.inMinutes % 60)
                    .toString()
                    .padLeft(2, '0') +
                ":" +
                (gv.ctlVideo.value.position.inSeconds % 60)
                    .toString()
                    .padLeft(2, '0')),
            Expanded(
              child: CupertinoSlider(
                min: 0.0,
                max: gv.intHomeVDMS.toDouble(),
                divisions: (gv.intHomeVDMS / 1000).toInt(),
                value: gv.dblHomeVDSliderValueMS,
                onChanged: (double dblNewValue) {
                  objVideoSliderChanged(dblNewValue);
                },
              ),
            ),
            Text(gv.ctlVideo.value.duration.inHours.toString() +
                ":" +
                (gv.ctlVideo.value.duration.inMinutes % 60)
                    .toString()
                    .padLeft(2, '0') +
                ":" +
                (gv.ctlVideo.value.duration.inSeconds % 60)
                    .toString()
                    .padLeft(2, '0')),
            Text(' '),
          ],
        );
      } else {
        return Container();
      }
    } catch (err) {
      return Container();
    }
  }

  // Function when Slider Changed Manually
  objVideoSliderChanged(dblNewValue) {
    gv.dblHomeVDSliderValueMS = dblNewValue;
    gv.ctlVideo
        .seekTo(Duration(milliseconds: gv.dblHomeVDSliderValueMS.toInt()));
    setState(() {});
  }

  // Function Start Camera
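  // Note: these calls use the older camera plugin API, in which
  // takePicture() and startVideoRecording() accept an output file path.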
  Future<void> funCameraStart() async {
    // Declare File Name
    DateTime dtTimeStamp() => DateTime.now();
    String strTimeStamp = DateFormat('yyyyMMdd_kkmmss').format(dtTimeStamp());
    String strMovieFile = gv.strMoviePath + '/' + strTimeStamp + '.mp4';
    gv.strImageFile = gv.strImagePath + '/' + strTimeStamp;
    print('File Path: ' + strMovieFile);

    try {
      await ctlCamera.takePicture(gv.strImageFile + '_01.jpg');
      await ctlCamera.startVideoRecording(strMovieFile);
    } catch(err) {
      ut.showToast('4:' + ls.gs('SystemErrorOpenAgain'), true);
    }
  }
  // Function Stop Camera
  Future<void> funCameraStop() async {
    try {
      await ctlCamera.stopVideoRecording();
    } catch(err) {
      // ut.showToast('5:' + ls.gs('SystemErrorOpenAgain'), true);
    }
    try {
      await ctlCamera.takePicture(gv.strImageFile + '_02.jpg');
    } catch(err) {
      // ut.showToast('5:' + ls.gs('SystemErrorOpenAgain'), true);
    }
  }

  // Main Widget
  @override
  Widget build(BuildContext context) {
    try {
      if (ctlCamera != null) {
        if (!ctlCamera.value.isInitialized) {
          print('return Container');
          return Container();
        }
        print('Before Return');
        return Scaffold(
          appBar: PreferredSize(
            child: AppBar(
              title: Text(
                ls.gs('Player'),
                style: TextStyle(fontSize: sv.dblDefaultFontSize),
              ),
            ),
            preferredSize: new Size.fromHeight(sv.dblTopHeight),
          ),
          body: Column(
              mainAxisAlignment: MainAxisAlignment.center,
              crossAxisAlignment: CrossAxisAlignment.center,
              children: <Widget>[
                ctnVideoPlayer(),
                Stack(children: <Widget>[
                  Container(
                    // color: Colors.white,
                    height: sv.dblBodyHeight / 25,
                    width: sv.dblScreenWidth,
                    child: Center(
                      child: Row(
                          mainAxisAlignment: MainAxisAlignment.center,
                          crossAxisAlignment: CrossAxisAlignment.center,
                          children: <Widget>[
                            Text('          '),
                            AspectRatio(
                              aspectRatio: ctlCamera.value.aspectRatio,
                              child: CameraPreview(ctlCamera),
                            ),
                          ]),
                    ),
                  ),
                  Container(
                    // color: Colors.white,
                    height: sv.dblBodyHeight / 25,
                    width: sv.dblScreenWidth,
                    child: Center(
                      child: Row(
                        mainAxisAlignment: MainAxisAlignment.center,
                        crossAxisAlignment: CrossAxisAlignment.center,
                        children: <Widget>[
                          RaisedButton(
                            onPressed: () => funSelectVideo(),
                            child: Icon(Icons.folder_open),
                          ),
                          Text(' '),
                          RaisedButton(
                            onPressed: () => funBackwardVideo(),
                            child: Icon(FontAwesomeIcons.angleDoubleLeft),
                          ),
                          Text(' '),
                          RaisedButton(
                            onPressed: () => funPlayVideo(),
                            child: bolVideoPaused
                                ? Icon(Icons.play_arrow)
                                : Icon(Icons.pause),
                          ),
                          Text(' '),
                          RaisedButton(
                            onPressed: () => funForwardVideo(),
                            child: Icon(FontAwesomeIcons.angleDoubleRight),
                          ),
                        ],
                      ),
                    ),
                  ),
                ])
              ]),
          bottomNavigationBar: ClsBottom(),
        );
      } else {
        return Container();
      }
    } catch (err) {
      print('PageHome Error build: ' + err.toString());
      return Container();
    }
  }
}
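
If upgrading the plugin is an option, there is a more direct route: newer releases of image_picker add a maxDuration parameter to pickVideo, which limits the recording time without any custom timer. A minimal sketch of the current, instance-based API (not available in 0.4.10; check your plugin version):

// Limit recording time via image_picker's maxDuration parameter.
final picker = ImagePicker();
picker.pickVideo(
  source: ImageSource.camera,
  maxDuration: Duration(seconds: 30), // hypothetical 30-second limit
).then((file) {
  if (file != null && mounted) {
    // In current plugin versions the result is an XFile; use file.path.
    var tempFile = File(file.path);
  }
});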

