
WebRTC Java server trouble

姬英武
2023-03-14
Question

I think I'm very close to getting a Java server application to talk to a browser page via WebRTC, but I can't quite get it to work. I feel like I'm missing something small, so I'm hoping someone here can offer a suggestion.

I studied the WebRTC examples closely — the Java unit test (org.webrtc.PeerConnectionTest) and the sample Android app (trunk/talk/examples/android). Based on what I learned, I wrote a Java application that uses WebSockets for signaling and tries to send a video stream to Chrome.

The problem is that even though all my code (JavaScript and Java) executes in the order I expect and hits all the right logging statements, no video shows up in the browser. The native libjingle code produces some suspicious output in the console log, but I'm not sure what to make of it. I've highlighted the suspicious lines in the log with ">>". For example, the video port allocators seem to be destroyed shortly after they are created, so something is clearly wrong. Also, "Changing video state, recv=1 send=0" doesn't look right, since the Java side should be sending video, not receiving it. Maybe I'm misusing the OfferToReceiveVideo option?

If you look at the logs below, you'll see that the WebSocket communication with the browser works fine, and that I'm able to send the SDP offer to the browser and receive the SDP answer back successfully. Setting the local and remote descriptions on the PeerConnections also seems to work. The HTML5 video element gets its source set to a BLOB URL, as expected. So what could I be missing? Do I need to do anything with ICE candidates, even though my client and server are currently on the same machine?

Any suggestions would be greatly appreciated!

SDP messages (from Chrome's JavaScript console)

1.134: Java Offer: 
v=0
o=- 5893945934600346864 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS JavaMediaStream
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:dJxTlMlXy7uASrDU
a=ice-pwd:r8BRkXVnc4dqCABUDhuRjpp7
a=ice-options:google-ice
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:yq6wOHhk/QfsWuh+1oOEqfB4GjKZzz8XfQnGCDP3
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000
a=ssrc:3720473526 cname:nul6R21KmwAms3Ge
a=ssrc:3720473526 msid:JavaMediaStream JavaMediaStream_v0
a=ssrc:3720473526 mslabel:JavaMediaStream
a=ssrc:3720473526 label:JavaMediaStream_v0


1.149: Received remote stream


1.150: Browsers Answer: 
v=0
o=- 4261396844048664099 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:quzQNsX+ZlUWUQqV
a=ice-pwd:y5A0+7sM8P88AatBLd1fdd5G
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=recvonly
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:WClNA69OfpjdJy3Bv4ujejk/IYnn4DW8kjrB18xP
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000

This looks fine to me; the Java offer includes my video stream.

Native code log (libjingle)

(suspicious lines are marked with ">>")

Camera '/dev/video0' started with format YUY2 640x480x30, elapsed time 59 ms
Ignored line: c=IN IP4 0.0.0.0
NACK enabled for channel 0
NACK enabled for channel 0
Created channel for video
Jingle:Channel[video|1|__]: NULL DTLS identity supplied. Not doing DTLS
Jingle:Channel[video|2|__]: NULL DTLS identity supplied. Not doing DTLS
Session:5893945934600346864 Old state:STATE_INIT New state:STATE_SENTINITIATE Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting local video description
AddSendStream {id:JavaMediaStream_v0;ssrcs:[3720473526];ssrc_groups:;cname:nul6R21KmwAms3Ge;sync_label:JavaMediaStream}
Add send ssrc: 3720473526
>> Warning(webrtcvideoengine.cc:2704): SetReceiverBufferingMode(0, 0) failed, err=12606
Changing video state, recv=0 send=0
Transport: video, allocating candidates
Transport: video, allocating candidates
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Added port to allocator
Ignored line: c=IN IP4 0.0.0.0
Warning(webrtcvideoengine.cc:2309): GetStats: sender information not ready.
Jingle:Channel[video|1|__]: Other side didn't support DTLS.
Jingle:Channel[video|2|__]: Other side didn't support DTLS.
Enabling BUNDLE, bundling onto transport: video
Channel enabled
>> Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_SENTINITIATE New state:STATE_RECEIVEDACCEPT Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting remote video description
Hybrid NACK/FEC enabled for channel 0
Hybrid NACK/FEC enabled for channel 0
SetSendCodecs() : selected video codec VP8/1280x720x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 0, frame dropping = 1, key frame interval = 3000
WARNING: no real random source present!
SRTP activated with negotiated parameters: send cipher_suite AES_CM_128_HMAC_SHA1_80 recv cipher_suite AES_CM_128_HMAC_SHA1_80
Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_RECEIVEDACCEPT New state:STATE_INPROGRESS Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Relay
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Relay
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Added port to allocator
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=SslTcp
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=SslTcp
All candidates gathered for video:1:0
Transport: video, component 1 allocation complete
Transport: video allocation complete
Candidate gathering is complete.
Capture delay changed to 120 ms
Captured frame size 640x480. Expected format YUY2 640x480x30
Capture size changed : selected video codec VP8/640x480x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 1, frame dropping = 1, key frame interval = 3000
VAdapt Frame: 0 / 300 Changes: 0 Input: 640x480 Scale: 1 Output: 640x480 Changed: false
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Removed port from allocator (3 remaining)
Removed port from p2p socket: 3 remaining
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Removed port from allocator (2 remaining)
Removed port from p2p socket: 2 remaining
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Removed port from allocator (1 remaining)
Removed port from p2p socket: 1 remaining
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Removed port from allocator (0 remaining)
Removed port from p2p socket: 0 remaining

The HTML

<html lang="en">
    <head>
        <title>Web Socket Signalling</title>
        <link rel="stylesheet" href="css/socket.css">
        <script src="js/socket.js"></script>
    </head>
    <body>
        <h2>Response from Server</h2>
        <textarea id="responseText"></textarea>

        <h2>Video</h2>
        <video id="remoteVideo" autoplay></video>
    </body>
</html>

The JavaScript

(function() {
  var remotePeerConnection;
  var sdpConstraints = {
    'mandatory' : {
      'OfferToReceiveAudio' : false,
      'OfferToReceiveVideo' : true
    }
  };


  var Sock = function() {
    var socket;
    if (!window.WebSocket) {
      window.WebSocket = window.MozWebSocket;
    }


    if (window.WebSocket) {
      socket = new WebSocket("ws://localhost:8080/websocket");
      socket.onopen = onopen;
      socket.onmessage = onmessage;
      socket.onclose = onclose;
    } else {
      alert("Your browser does not support Web Socket.");
    }


    function onopen(event) {
      getTextAreaElement().value = "Web Socket opened!";
    }


    function onmessage(event) {
      appendTextArea(event.data);


      var sdpOffer = new RTCSessionDescription(JSON.parse(event.data));


      remotePeerConnection = new webkitRTCPeerConnection(null);
      remotePeerConnection.onaddstream = gotRemoteStream;


      trace("Java Offer: \n" + sdpOffer.sdp);
      remotePeerConnection.setRemoteDescription(sdpOffer);
      remotePeerConnection.createAnswer(gotRemoteDescription, onCreateSessionDescriptionError, sdpConstraints);


    }
    function onCreateSessionDescriptionError(error) {
      console.log('Failed to create session description: '
          + error.toString());
    }

    function gotRemoteDescription(answer) {
      remotePeerConnection.setLocalDescription(answer);
      trace("Browser's Answer: \n" + answer.sdp);


      socket.send(JSON.stringify(answer));
    }


    function gotRemoteStream(event) {
      var remoteVideo = document.getElementById("remoteVideo");
      remoteVideo.src = URL.createObjectURL(event.stream);
      trace("Received remote stream");
    }


    function onclose(event) {
      appendTextArea("Web Socket closed");
    }


    function appendTextArea(newData) {
      var el = getTextAreaElement();
      el.value = el.value + '\n' + newData;
    }


    function getTextAreaElement() {
      return document.getElementById('responseText');
    }


    function trace(text) {
      console.log((performance.now() / 1000).toFixed(3) + ": " + text);
    }


  }
  window.addEventListener('load', function() {
    new Sock();
  }, false);
})();

The Java server

public class PeerConnectionManager {

   /**
    * Called when the WebSocket handshake is completed
    */
   public void createOffer() {

      peerConnection = factory.createPeerConnection(
            new ArrayList<PeerConnection.IceServer>(),
            new MediaConstraints(), 
            new PeerConnectionObserverImpl());


      // Get the video source
      videoSource = factory.createVideoSource(VideoCapturer.create(""), new MediaConstraints());


      // Create a MediaStream with one video track
      MediaStream lMS = factory.createLocalMediaStream("JavaMediaStream");
      VideoTrack videoTrack = factory.createVideoTrack("JavaMediaStream_v0", videoSource);
      videoTrack.addRenderer(new VideoRenderer(new VideoRendererObserverImpl()));
      lMS.addTrack(videoTrack);
      peerConnection.addStream(lMS, new MediaConstraints());

      // We don't want to receive anything
      MediaConstraints sdpConstraints = new MediaConstraints();
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveAudio", "false"));
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveVideo", "false"));


      // Get the Offer SDP
      SdpObserverImpl sdpOfferObserver = new SdpObserverImpl();
      peerConnection.createOffer(sdpOfferObserver, sdpConstraints);
      SessionDescription offerSdp = sdpOfferObserver.getSdp();

      // Set local SDP, don't care for any callbacks
      peerConnection.setLocalDescription(new SdpObserverImpl(), offerSdp);


      // Serialize Offer and send to the Browser via a WebSocket
      JSONObject offerSdpJson = new JSONObject();
      offerSdpJson.put("sdp", offerSdp.description);
      offerSdpJson.put("type", offerSdp.type.canonicalForm());
      webSocketContext.channel().writeAndFlush(
            new TextWebSocketFrame(offerSdpJson.toString()));


   }


   /**
    * Called when an SDP Answer arrives via the WebSocket
    */
   public void setRemoteDescription(SessionDescription answer) {
      peerConnection.setRemoteDescription( new SdpObserverImpl(), answer);

   }
}

Answer

Ah. Never mind. Sorry for the silly question.

The missing piece was the exchange of ICE candidates between the browser and the Java server. Now that I've added code to negotiate ICE over the WebSocket, everything works!
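For anyone hitting the same wall, the browser side of that exchange can be sketched roughly as follows. This is only a sketch: the `{type: "candidate", ...}` message shape and the `encodeCandidate`/`decodeSignal` helper names are assumptions for illustration, not part of WebRTC or of the code above.

```javascript
// Sketch: exchanging ICE candidates over the existing signaling WebSocket.
// The {type: "candidate", ...} message format is an assumption, not a standard.

// Serialize a local ICE candidate into a signaling message.
function encodeCandidate(candidate) {
  return JSON.stringify({
    type: "candidate",
    sdpMid: candidate.sdpMid,
    sdpMLineIndex: candidate.sdpMLineIndex,
    candidate: candidate.candidate
  });
}

// Parse an incoming signaling message; the "type" field distinguishes
// SDP ("offer"/"answer") from trickled ICE ("candidate").
function decodeSignal(data) {
  return JSON.parse(data);
}

// Wiring (browser side, after creating remotePeerConnection):
//
//   remotePeerConnection.onicecandidate = function(event) {
//     if (event.candidate) {               // null signals end of candidates
//       socket.send(encodeCandidate(event.candidate));
//     }
//   };
//
// And in onmessage, when msg.type === "candidate":
//
//   remotePeerConnection.addIceCandidate(new RTCIceCandidate(msg));
//
// The Java side mirrors this: PeerConnection.Observer's onIceCandidate
// callback sends each candidate over the WebSocket, and incoming candidate
// messages are fed to peerConnection.addIceCandidate(...).
```

Even with both endpoints on the same machine, the host candidates still have to be signaled to the remote side before the ICE connectivity checks can run, which is why the port allocators in the log were torn down without ever being used.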


