One-way audio in a 1:1 call using JavaScript with .NET MAUI and Azure Communication Services


I'm making a 1:1 video call using Azure Communication Services (ACS) from a .NET MAUI app. I've managed to get video working in both directions, but audio only works one way: the callee can hear the caller, but the caller cannot hear the callee. I've tried using RemoteAudioStream, with no luck so far. I've handled the permissions correctly, at least on the .NET MAUI side; I've checked the app permissions in the manifest and under the app details/permissions several times.

Basically, I don't see any errors in the output log:

CallAgent initialized.
main.js:61 Device permissions granted.
main.js:65 Available microphones: Array(3)
main.js:71 Selected microphone: Headset earpiece
main.js:80 Incoming call received.
main.js:85 Notifying user of incoming call.
main.js:127 Call accepted.
main.js:273 Local participant is muted: false
main.js:312 Remote participant added: 8:acs:7fd9cad8-493b-4fd3-9e67-48bd84547a61_00000022-e7e6-7d84-8ed3-....
main.js:315 Remote participant is muted: false
main.js:276 Call state changed to: EarlyMedia
main.js:276 Call state changed to: Connected
main.js:280 Connected label shown.
main.js:336 Remote participant state changed to: Connected
main.js:330 Remote video stream removed

It works fine when I use a regular Chrome browser or run it on desktop, and even desktop-to-Android calls work; it's only mobile-to-mobile that fails.

Here is the JavaScript I'm using:

// Import necessary ACS modules
const { CallClient, VideoStreamRenderer, LocalVideoStream } = require('@azure/communication-calling');
const { AzureCommunicationTokenCredential } = require('@azure/communication-common');
const { AzureLogger, setLogLevel } = require("@azure/logger");

// Set the log level and output for debugging purposes
setLogLevel('error'); // Change to 'info' or 'verbose' for more detailed logs during development
AzureLogger.log = (...args) => {
    console.error(...args);
};

// Variables to store ACS objects and call state
let callAgent;
let deviceManager;
let call;
let currentCall;
let localVideoStream;
let localVideoRenderer;
let remoteVideoRenderer;
let isVideoStarted = false; // Track video state

// UI elements
let remoteVideosGallery = document.getElementById('remoteVideosGallery');
let localVideoContainer = document.getElementById('localVideoContainer');
let connectedLabel = document.getElementById('connectedLabel'); // Ensure this element exists in HTML
let localAudioIndicator = document.getElementById('localAudioIndicator'); // Optional: Add this in your HTML

// Extract query parameters
const urlParams = new URLSearchParams(window.location.search);
const token = urlParams.get('token');
const userId = urlParams.get('userId');
const targetId = urlParams.get('targetId');
const role = urlParams.get('role'); 

// Function to select the best microphone
function selectBestMicrophone(mics) {
    // Prefer 'Headset earpiece' if available
    const headset = mics.find(m => m.name.toLowerCase().includes('headset'));
    if (headset) return headset;
    
    // Fallback to 'microphone:default' if available
    const defaultMic = mics.find(m => m.id === 'microphone:default');
    if (defaultMic) return defaultMic;
    
    // Otherwise, return the first available microphone with a valid name
    return mics.find(m => m.name && m.name.trim() !== '') || mics[0];
}

// Function to initialize the Call Agent
async function initializeCallAgent(token, userId, targetId, role) {
    try {
        const credential = new AzureCommunicationTokenCredential(token);
        const callClient = new CallClient();

        callAgent = await callClient.createCallAgent(credential);
        console.log('CallAgent initialized.');

        // Initialize the device manager to access camera and microphone
        deviceManager = await callClient.getDeviceManager();
        await deviceManager.askDevicePermission({ video: true, audio: true });
        console.log('Device permissions granted.');

        // Select the best microphone
        const microphones = await deviceManager.getMicrophones();
        console.log('Available microphones:', microphones);

        let selectedMicrophone = selectBestMicrophone(microphones);

        if (selectedMicrophone) {
            await deviceManager.selectMicrophone(selectedMicrophone);
            console.log(`Selected microphone: ${selectedMicrophone.name || 'Default Microphone'}`);
        } else {
            console.error('No suitable microphone device found.');
            alert('No suitable microphone detected. Please connect a microphone and refresh the page.');
            return;
        }

        // Handle incoming calls
        callAgent.on('incomingCall', async (args) => {
            console.log('Incoming call received.');
            currentCall = args.incomingCall;

            // Notify the user of the incoming call
            setTimeout(async () => {
                console.log('Notifying user of incoming call.');
                alert('Incoming call received. Press OK to accept.');
                await acceptCall(); // Automatically accept the call for testing
            }, 1000);
        });
    } catch (error) {
        console.error('Error initializing CallAgent:', error);
        alert(`Failed to initialize call. Error: ${error.message}`);
    }
}

async function startCall() {
    if (!callAgent) {
        console.error('CallAgent is not initialized.');
        alert('CallAgent is not initialized.');
        return;
    }

    try {
        const targetUser = { communicationUserId: targetId };
        const callOptions = {
            audioOptions: { muted: false },  // Start with audio, no video
            // videoOptions: { localVideoStreams: [] } // Uncomment if starting without video
        };

        call = callAgent.startCall([targetUser], callOptions);
        console.log('Call initiated.');
        setupCall(call);
    } catch (error) {
        console.error('Error starting call:', error);
        alert('Failed to start call. Please check the console for more details.');
    }
}

async function acceptCall() {
    if (currentCall) {
        try {
            const callOptions = {
                audioOptions: { muted: false },  // Start with audio
                // videoOptions: { localVideoStreams: [] } // Uncomment if accepting without video
            };
            call = await currentCall.accept(callOptions);
            console.log('Call accepted.');
            setupCall(call);
        } catch (error) {
            console.error('Error accepting call:', error);
            alert('Failed to accept the call.');
        }
    } else {
        console.error('No incoming call to accept.');
    }
}

async function declineCall() {
    if (currentCall) {
        try {
            await currentCall.reject();
            console.log('Call declined.');
            currentCall = null;
        } catch (error) {
            console.error('Error declining call:', error);
            alert('Failed to decline the call.');
        }
    }
}

async function hangUpCall() {
    if (call) {
        try {
            await call.hangUp();
            console.log('Call hung up.');
            call = null;
            currentCall = null;
            isVideoStarted = false;  // Reset video state after hang up

            // Clear video containers when the call ends
            clearVideoContainers();
        } catch (error) {
            console.error('Error hanging up call:', error);
            alert('Failed to hang up the call.');
        }
    }
}

// Clear video elements (both local and remote)
function clearVideoContainers() {
    if (localVideoRenderer) {
        localVideoRenderer.dispose();  // Dispose the local video renderer
        localVideoRenderer = null;
        localVideoContainer.hidden = true;
    }

    if (remoteVideoRenderer) {
        remoteVideoRenderer.dispose();  // Dispose the remote video renderer
        remoteVideoRenderer = null;
    }
    
    remoteVideosGallery.innerHTML = ''; // Clear all remote video elements
}

// Start video
async function startVideo() {
    if (!call) {
        console.error('No active call to start video.');
        alert('No active call to start video.');
        return;
    }

    try {
        if (!localVideoStream) {
            localVideoStream = await createLocalVideoStream();
        }

        await call.startVideo(localVideoStream);
        console.log('Video started.');
        await displayLocalVideoStream();
        isVideoStarted = true;
    } catch (error) {
        console.error('Error starting video:', error);
        alert('Failed to start video.');
    }
}

// Stop video
async function stopVideo() {
    if (!call || !isVideoStarted) {
        console.error('No active video to stop.');
        alert('No active video to stop.');
        return;
    }

    try {
        await call.stopVideo(localVideoStream);
        console.log('Video stopped.');
        isVideoStarted = false;

        // Remove local video stream and replace it with a black screen
        if (localVideoRenderer) {
            localVideoRenderer.dispose();
            localVideoRenderer = null;
            localVideoContainer.style.backgroundColor = 'black'; // Black background
        }
    } catch (error) {
        console.error('Error stopping video:', error);
        alert('Failed to stop video.');
    }
}

// Create local video stream
async function createLocalVideoStream() {
    const cameras = await deviceManager.getCameras();
    if (cameras.length > 0) {
        const camera = cameras[0];
        console.log(`Using camera: ${camera.name || 'Default Camera'}`);
        localVideoStream = new LocalVideoStream(camera);
        return localVideoStream;
    } else {
        console.error('No camera device found.');
        alert('No camera device detected. Please connect a camera and refresh the page.');
        return null;
    }
}

// Display local video stream
async function displayLocalVideoStream() {
    try {
        if (localVideoRenderer) {
            localVideoRenderer.dispose();  // Dispose of existing renderer to prevent duplicates
        }

        if (localVideoStream) {
            localVideoRenderer = new VideoStreamRenderer(localVideoStream);
            const view = await localVideoRenderer.createView();
            localVideoContainer.innerHTML = '';  // Clear any existing video element
            localVideoContainer.appendChild(view.target);
            localVideoContainer.hidden = false;
            console.log('Local video stream rendered.');
        }
    } catch (error) {
        console.error('Error displaying local video stream:', error);
        alert('Failed to display local video stream.');
    }
}

// Setup call event listeners
function setupCall(activeCall) {
    // Use a distinct parameter name so the module-level `call` can be cleared on disconnect.
    currentCall = activeCall;

    console.log(`Local participant is muted: ${activeCall.isMuted}`);

    activeCall.on('stateChanged', () => {
        console.log(`Call state changed to: ${activeCall.state}`);
        if (activeCall.state === 'Connected') {
            if (connectedLabel) {
                connectedLabel.hidden = false;
                console.log('Connected label shown.');
            }
        } else if (activeCall.state === 'Disconnected') {
            if (connectedLabel) {
                connectedLabel.hidden = true;
                console.log('Connected label hidden.');
            }
            call = null;         // Clear the module-level reference
            currentCall = null;
            clearVideoContainers(); // Ensure media is cleared on disconnect
        }
    });

    activeCall.on('isMutedChanged', () => {
        console.log(`Local participant mute state changed: ${activeCall.isMuted}`);
        if (localAudioIndicator) {
            localAudioIndicator.textContent = activeCall.isMuted ? 'Local Audio: Muted' : 'Local Audio: Unmuted';
        }
    });

    activeCall.remoteParticipants.forEach(remoteParticipant => {
        subscribeToRemoteParticipant(remoteParticipant);
    });

    activeCall.on('remoteParticipantsUpdated', e => {
        e.added.forEach(remoteParticipant => subscribeToRemoteParticipant(remoteParticipant));
        e.removed.forEach(() => console.log('Remote participant removed'));
    });
}

// Subscribe to remote participants and handle video streams
function subscribeToRemoteParticipant(remoteParticipant) {
    const remoteId = remoteParticipant.identifier.communicationUserId || remoteParticipant.identifier.id;
    console.log(`Remote participant added: ${remoteId}`);

    // Log if remote participant is muted
    console.log(`Remote participant is muted: ${remoteParticipant.isMuted}`);

    remoteParticipant.on('isMutedChanged', () => {
        console.log(`Remote participant mute state changed: ${remoteParticipant.isMuted}`);
    });

    remoteParticipant.videoStreams.forEach(remoteVideoStream => {
        subscribeToRemoteVideoStream(remoteVideoStream);
    });

    remoteParticipant.on('videoStreamsUpdated', e => {
        e.added.forEach(remoteVideoStream => {
            subscribeToRemoteVideoStream(remoteVideoStream);
        });
        e.removed.forEach(() => {
            console.log('Remote video stream removed');
        });
    });

    // Handle remote participant state changes
    remoteParticipant.on('stateChanged', () => {
        console.log(`Remote participant state changed to: ${remoteParticipant.state}`);
    });
}

// Subscribe to remote video streams
async function subscribeToRemoteVideoStream(remoteVideoStream) {
    if (remoteVideoStream.isAvailable) {
        await displayRemoteVideoStream(remoteVideoStream);
    }

    remoteVideoStream.on('isAvailableChanged', async () => {
        if (remoteVideoStream.isAvailable) {
            await displayRemoteVideoStream(remoteVideoStream);
        } else {
            console.log('Remote video stream is no longer available.');
        }
    });
}

// Display remote video stream
async function displayRemoteVideoStream(remoteVideoStream) {
    try {
        // Dispose of the previous renderer to prevent duplication
        if (remoteVideoRenderer) {
            remoteVideoRenderer.dispose();
            remoteVideoRenderer = null;
        }

        remoteVideoRenderer = new VideoStreamRenderer(remoteVideoStream);
        const view = await remoteVideoRenderer.createView();
        remoteVideosGallery.innerHTML = '';  // Clear existing video elements
        remoteVideosGallery.appendChild(view.target);
        console.log('Remote video stream rendered.');
    } catch (error) {
        console.error('Error rendering remote video stream:', error);
        alert('Failed to render remote video stream.');
    }
}

// Initialize Call Agent with extracted parameters
initializeCallAgent(token, userId, targetId, role);

// Expose functions to be callable from MAUI WebView
window.startCall = startCall;
window.hangUpCall = hangUpCall;
window.startVideo = startVideo;
window.stopVideo = stopVideo;
window.acceptCall = acceptCall;
window.declineCall = declineCall;
window.toggleMute = toggleMute;

window.isReady = true;  // Flag to notify MAUI WebView that the script is ready

// For testing in a browser environment
console.log('Script is ready.');

// Toggle Mute Function
let isMuted = false;

// Toggle mute state
async function toggleMute() {
    if (!call) {
        console.error('No active call to toggle mute.');
        alert('No active call to toggle mute.');
        return;
    }

    try {
        if (isMuted) {
            await call.unmute();
            console.log('Unmuted.');
        } else {
            await call.mute();
            console.log('Muted.');
        }
        isMuted = !isMuted;

        // Optionally, update local audio indicator
        if (localAudioIndicator) {
            localAudioIndicator.textContent = isMuted ? 'Local Audio: Muted' : 'Local Audio: Unmuted';
        }
    } catch (error) {
        console.error('Error toggling mute:', error);
        alert('Failed to toggle mute.');
    }
}

At one point it actually worked in both directions and the call was perfect, but after simply re-running it, without changing anything, it went back to being one-way!

I'd appreciate any hints, help, or criticism; I'm new to .NET MAUI and I'm enjoying it so far! Thank you for your time.

To summarize: I'm attempting a 1:1 video call with Azure Communication Services. For audio I implemented remoteAudioStream.getMediaStream() and subscribed to the remote participant, but audio is one-way: the callee can hear the caller, yet the caller cannot hear the callee. I've tried selecting the microphone and speaker via deviceManager, checking mute states, and handling audio streams with the audioStreamsUpdated event. I expected two-way audio.

I have also looked into the following (a rough diagnostic sketch follows the list):

  1. Microphone selection: the app might not be using the correct microphone.
  2. Participant subscription: the remote participant might not be subscribed to correctly.
  3. Mute state: one of the participants might be muted unintentionally.
  4. Device permissions: microphone access might not have been granted properly.
  5. Audio device management: the app might not be selecting or managing audio devices correctly.
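
For reference, this is roughly the one-shot check I ran against those points while debugging. It's only a sketch: logAudioDiagnostics is my own helper name, and it assumes the deviceManager and call globals from the script above plus the SDK's selectedMicrophone/selectedSpeaker properties.

// Rough diagnostic snapshot covering the five points above.
// Assumes the `deviceManager` and `call` globals from the script above.
async function logAudioDiagnostics() {
    if (!deviceManager || !call) {
        console.warn('Diagnostics skipped: deviceManager or call not ready.');
        return;
    }

    // 4. Device permissions: askDevicePermission resolves with what was actually granted.
    const access = await deviceManager.askDevicePermission({ audio: true, video: true });
    console.log(`Granted - audio: ${access.audio}, video: ${access.video}`);

    // 1. + 5. Microphone / speaker the SDK is actually using.
    console.log('Selected microphone:', deviceManager.selectedMicrophone?.name);
    console.log('Selected speaker:', deviceManager.selectedSpeaker?.name);

    // 3. Local mute state.
    console.log('Local isMuted:', call.isMuted);

    // 2. Participant subscription: confirm the remote side is present and unmuted.
    call.remoteParticipants.forEach(p => {
        const id = p.identifier.communicationUserId || p.identifier.id;
        console.log(`Remote ${id} - state: ${p.state}, isMuted: ${p.isMuted}`);
    });
}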

I'm also serving this over HTTPS with a valid SSL certificate.

javascript .net webrtc maui azure-communication-services
1 Answer

Microphone selection: the app might not be using the correct microphone.

You mentioned microphone selection, but speaker selection can also cause one-way audio. Check whether you are explicitly selecting the correct audio output device. Depending on the platform, you may need to set the speaker manually to verify that audio is routed to the right device (headset, phone speaker, etc.):

const speakers = await deviceManager.getSpeakers();
const selectedSpeaker = speakers.find(speaker => speaker.name.includes('desired speaker'));
if (selectedSpeaker) {
  await deviceManager.selectSpeaker(selectedSpeaker);
  console.log(`Selected speaker: ${selectedSpeaker.name}`);
}
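
On mobile the output route can also change mid-call (a headset or Bluetooth device connecting or disconnecting), so it can help to re-apply the selection whenever the device list changes. A minimal sketch, assuming the SDK's audioDevicesUpdated event is available in your version; pickSpeaker stands in for whatever selection logic you prefer:

// Re-apply speaker selection whenever the audio device list changes,
// e.g. when a headset is plugged in or removed mid-call.
deviceManager.on('audioDevicesUpdated', async () => {
  const speakers = await deviceManager.getSpeakers();
  const speaker = pickSpeaker(speakers); // placeholder: e.g. prefer a headset, else speakers[0]
  if (speaker) {
    await deviceManager.selectSpeaker(speaker);
    console.log(`Re-selected speaker: ${speaker.name}`);
  }
});
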
  • If you are using remoteAudioStream.getMediaStream() for audio, make sure the remote stream is actually being played by dynamically creating an audio element and attaching the media stream to it:
async function playRemoteAudio(remoteAudioStream) {
  const mediaStream = await remoteAudioStream.getMediaStream();
  const audioElement = new Audio();
  audioElement.srcObject = mediaStream;
  audioElement.play().catch(error => {
    console.error('Error playing remote audio:', error);
  });
}

// Subscribe to remote participant and play audio when available
remoteParticipant.on('audioStreamsUpdated', (e) => {
  e.added.forEach(remoteAudioStream => {
    playRemoteAudio(remoteAudioStream);
  });
});
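
If you go this route, it is also worth tearing the element down when the stream is removed, so stale audio elements don't pile up. A sketch extending the handler above; the remoteAudioElements map is introduced just for this example:

// Track the audio element created for each remote audio stream so it can be
// stopped and released when that stream is removed.
const remoteAudioElements = new Map();

remoteParticipant.on('audioStreamsUpdated', (e) => {
  e.added.forEach(async (remoteAudioStream) => {
    const mediaStream = await remoteAudioStream.getMediaStream();
    const audioElement = new Audio();
    audioElement.srcObject = mediaStream;
    audioElement.play().catch(error => console.error('Error playing remote audio:', error));
    remoteAudioElements.set(remoteAudioStream, audioElement);
  });

  e.removed.forEach((remoteAudioStream) => {
    const audioElement = remoteAudioElements.get(remoteAudioStream);
    if (audioElement) {
      audioElement.pause();
      audioElement.srcObject = null;
      remoteAudioElements.delete(remoteAudioStream);
    }
  });
});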

Here, we log on both the caller and callee sides and track microphone and speaker selection, participant subscription, and audio stream state; compare against the sample console output below.
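
A rough sketch of how that logging could be wired up on each side (attachAudioDiagnostics is a name made up here; it only uses calls and events already shown above), producing output like the sample that follows:

// Rough diagnostic wiring; call this on both the caller and the callee once
// the call object exists. attachAudioDiagnostics is a name made up for this sketch.
async function attachAudioDiagnostics(deviceManager, call) {
  console.info('[INFO] Fetching available microphones...');
  const mics = await deviceManager.getMicrophones();
  console.info('[INFO] Available microphones:', mics.map(m => m.name));

  console.info('[INFO] Fetching available speakers...');
  const speakers = await deviceManager.getSpeakers();
  console.info('[INFO] Available speakers:', speakers.map(s => s.name));

  call.on('remoteParticipantsUpdated', e => {
    e.added.forEach(p => {
      const id = p.identifier.communicationUserId || p.identifier.id;
      console.info('[INFO] Participant added:', id);
      p.on('isMutedChanged', () => console.info(`[INFO] ${id} isMuted:`, p.isMuted));
      p.on('audioStreamsUpdated', () => console.info(`[INFO] Audio streams updated for participant ${id}`));
    });
  });

  call.on('stateChanged', () => console.info('[INFO] Call state:', call.state));
}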

Console output:

[INFO] Initializing call setup...
[INFO] Fetching available microphones...
[INFO] Available microphones: 
    - Built-in Microphone
    - External USB Microphone
[INFO] Selected Microphone: Built-in Microphone

[INFO] Fetching available speakers...
[INFO] Available speakers: 
    - Built-in Speaker
    - Bluetooth Headphones
[INFO] Selected Speaker: Built-in Speaker

[INFO] Subscribing to remote participant...
[INFO] Participant added: Callee123

[INFO] Audio streams updated for participant Callee123
[INFO] New audio stream detected. Fetching media stream...

[INFO] Playing remote audio...
[INFO] Remote media stream successfully attached to audio element.
[DEBUG] Remote stream active: True
[DEBUG] Playing state: True
[INFO] Audio is playing from remote participant.

[INFO] Call setup complete.