Boemo is a software developer who embraces innovative approaches. He likes diving deep into complex concepts and writing articles that help readers understand complex methodologies in a simple and fun way.
The Agora SDK has a play audio feature. You can use this feature to play songs, voice recordings, and sound effects for entertainment or educational purposes during a video call.
In this tutorial, you will learn how to play audio files during a video call using the Agora SDK in Android.
Prerequisites and requirements
- An Agora developer account (see How to Get Started with Agora).
- Knowledge of how to create a live-streaming Android application using Agora. Check out the tutorial here.
- Basic knowledge of Android development.
- Android Studio.
- An Android device.
Adding dependencies in Gradle
Add the following dependencies in your build.gradle file in the app module, and sync to download the Agora third-party library. Ensure that you always use the latest Agora library version:
implementation 'com.yanzhenjie:permission:2.0.3'
implementation 'io.agora.rtc:full-sdk:3.5.0'
Adding permissions in the AndroidManifest.xml file
Add the following permissions in the AndroidManifest.xml file:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
Creating an instance of the RtcEngine
Now, let’s create an RtcEngine instance by passing your App ID and an IRtcEngineEventHandler to the RtcEngine.create() method. IRtcEngineEventHandler is an abstract class that provides default (no-op) implementations of the SDK callbacks:
try {
    mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
    Log.e(LOG_TAG, Log.getStackTraceString(e));
    throw new RuntimeException("fatal error\n" + Log.getStackTraceString(e));
}
Loading the audio file before the user plays it
To reduce the delay when playback starts, load the audio file in advance. The preloadEffect() function takes a sound ID and the audio file path and loads the selected file into memory. Call this function before the user joins the video call:
// Gets the global audio effect manager.
audioEffectManager = engine.getAudioEffectManager();
int id = 0;
// Add the file path for the audio you want to play.
audioEffectManager.preloadEffect(id++, Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3");
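Each preloaded effect needs a unique sound ID, and you must pass the same ID later to play or unload that effect. One way to keep the IDs straight is a small registry that maps file paths to IDs. This is a sketch of my own (the `EffectRegistry` class and its method names are not part of the Agora SDK):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Maps each audio file path to the sound ID it was preloaded under,
// so playEffect()/unloadEffect() can later be called with the right ID.
class EffectRegistry {
    private final Map<String, Integer> idsByPath = new LinkedHashMap<>();
    private int nextId = 0;

    // Returns the ID assigned to this path, allocating a new one on first use.
    int idFor(String filePath) {
        Integer existing = idsByPath.get(filePath);
        if (existing != null) {
            return existing;
        }
        int id = nextId++;
        idsByPath.put(filePath, id);
        return id;
    }
}
```

With this helper, you could call `audioEffectManager.preloadEffect(registry.idFor(path), path)` and be sure that the same ID comes back when you later play or unload the file.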
The playEffect() function plays the preloaded audio file. It takes the following parameters:
- soundId: The sound ID of the audio effect file to be played.
- filePath: The file path of the audio effect file.
- loopCount: The number of playback loops (-1 loops indefinitely).
- pitch: The pitch of the audio effect.
- pan: The spatial position of the effect (-1.0 to 1.0, where 0.0 is centered).
- gain: The volume of the effect as a percentage (0 to 100).
- publish: Whether to publish the effect to the remote users in the channel.
- startPos: The playback starting position, in milliseconds.
Next, call the playEffect() function to start the audio file, then pause it so that it is ready to resume during the call:
// Play the effect on a loop, then pause it until it is needed.
audioEffectManager.playEffect(
        0,
        Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3",
        -1,   // loopCount: loop indefinitely
        1,    // pitch: original pitch
        0.0,  // pan: centered
        100,  // gain: full volume
        true, // publish to remote users
        0     // startPos: play from the beginning
);
// Pauses all audio effects.
audioEffectManager.pauseAllEffects();
The Agora Android SDK supports only the following audio file types:
- MP3
- AAC
- M4A
- 3GP
- WAV
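Because only these formats are supported, it can help to verify the file extension before calling preloadEffect(). A minimal check might look like the following (the `AudioFormatCheck` helper is my own illustration, not an SDK API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

class AudioFormatCheck {
    // File extensions the Agora Android SDK can play as audio effects.
    private static final List<String> SUPPORTED =
            Arrays.asList("mp3", "aac", "m4a", "3gp", "wav");

    // Returns true if the path ends with a supported audio extension.
    static boolean isSupported(String filePath) {
        int dot = filePath.lastIndexOf('.');
        if (dot < 0 || dot == filePath.length() - 1) {
            return false;
        }
        String ext = filePath.substring(dot + 1).toLowerCase(Locale.ROOT);
        return SUPPORTED.contains(ext);
    }
}
```

Rejecting unsupported files up front gives the user a clear error instead of a silent playback failure.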
Adjusting the volume
You can set the volume of the call's playback audio by calling the adjustPlaybackSignalVolume() function and passing a volume value. By default, the volume is set to 100 (the original volume). Use this function to adjust the volume of the audio being played:
mRtcEngine.adjustPlaybackSignalVolume(55);
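In the 3.x SDK, adjustPlaybackSignalVolume() accepts values from 0 to 400, where values above 100 amplify the signal and may cause distortion. If you drive the volume from a slider, it is worth clamping the input first. This small helper is my own sketch, not part of the SDK:

```java
class VolumeUtil {
    // Agora playback volume range: 0 (mute) to 400 (4x amplification); 100 = original.
    static final int MIN_VOLUME = 0;
    static final int MAX_VOLUME = 400;

    // Clamps an arbitrary input (e.g. from a SeekBar) into the accepted range.
    static int clampVolume(int requested) {
        return Math.max(MIN_VOLUME, Math.min(MAX_VOLUME, requested));
    }
}
```

You would then call `mRtcEngine.adjustPlaybackSignalVolume(VolumeUtil.clampVolume(sliderValue))` so out-of-range slider values never reach the engine.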
Playing audio
The following code fetches the audio file path and plays the audio when the user presses the play button:
engine.startAudioMixing("add your file path here", false, false, -1, 0);
Stop playing
Use the stopAudioMixing() function to stop the audio when the stop button is pressed.
engine.stopAudioMixing();
Resume playing
The resumeAudioMixing() function resumes the audio being played.
engine.resumeAudioMixing();
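The start, pause, and resume calls only make sense in certain orders: you cannot resume audio mixing that was never started or paused (the SDK also provides pauseAudioMixing() for the pause step). A small state tracker, sketched below with my own class and method names, can decide which engine call a single play/pause button should trigger:

```java
// Tracks audio-mixing state so UI code knows which engine call is valid next.
class MixingState {
    enum State { STOPPED, PLAYING, PAUSED }

    private State state = State.STOPPED;

    State current() { return state; }

    // Call alongside engine.startAudioMixing(...).
    void onStart() { state = State.PLAYING; }

    // Call alongside engine.pauseAudioMixing(); only valid while playing.
    boolean onPause() {
        if (state != State.PLAYING) return false;
        state = State.PAUSED;
        return true;
    }

    // Call alongside engine.resumeAudioMixing(); only valid while paused.
    boolean onResume() {
        if (state != State.PAUSED) return false;
        state = State.PLAYING;
        return true;
    }

    // Call alongside engine.stopAudioMixing().
    void onStop() { state = State.STOPPED; }
}
```

Guarding the calls this way keeps the UI buttons from issuing a resume before anything has been paused.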
Integrating the play audio feature with the Agora Video Call SDK
You now know how to play audio files using the Agora SDK methods. The following code block shows you how to integrate the Agora play audio feature in a video streaming application:
public class MainActivity extends AppCompatActivity {

    private RtcEngine mRtcEngine;
    private IAudioEffectManager audioEffectManager;

    // Permissions
    private static final int PERMISSION_REQ_ID = 22;
    private static final String[] REQUESTED_PERMISSIONS = {Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA};
    private static final String LOG_TAG = MainActivity.class.getSimpleName();

    // Handle SDK events
    private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onUserJoined(final int uid, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    // Set the first remote user to the main background video container.
                    setupRemoteVideoStream(uid);
                }
            });
        }

        // Remote user has left the channel.
        @Override
        public void onUserOffline(int uid, int reason) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserLeft();
                }
            });
        }

        // Remote user has toggled their video.
        @Override
        public void onRemoteVideoStateChanged(final int uid, final int state, int reason, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserVideoToggle(uid, state);
                }
            });
        }
    };

    private void preloadAudioEffect() {
        // Gets the global audio effect manager.
        audioEffectManager = mRtcEngine.getAudioEffectManager();
        int id = 0;
        audioEffectManager.preloadEffect(id++, Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3");
        audioEffectManager.playEffect(
                0,
                Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3",
                -1,   // loopCount: loop indefinitely
                1,    // pitch: original pitch
                0.0,  // pan: centered
                100,  // gain: full volume
                true, // publish to remote users
                0     // startPos: play from the beginning
        );
        // Pauses all audio effects.
        audioEffectManager.pauseAllEffects();
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        findViewById(R.id.bass).setVisibility(View.GONE);     // hide the audio buttons until the user joins
        findViewById(R.id.beautify).setVisibility(View.GONE);
        if (checkSelfPermission(REQUESTED_PERMISSIONS[0], PERMISSION_REQ_ID) &&
                checkSelfPermission(REQUESTED_PERMISSIONS[1], PERMISSION_REQ_ID)) {
            initAgoraEngine();
        }
    }

    private void initAgoraEngine() {
        try {
            mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
            preloadAudioEffect();
        } catch (Exception e) {
            Log.e(LOG_TAG, Log.getStackTraceString(e));
            throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
        }
        setupSession();
    }

    private void setupSession() {
        mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_COMMUNICATION);
        mRtcEngine.enableVideo();
        mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(VideoEncoderConfiguration.VD_640x480,
                VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_30,
                VideoEncoderConfiguration.STANDARD_BITRATE,
                VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
    }

    private void setupLocalVideoFeed() {
        // Set up the container for the local user.
        FrameLayout videoContainer = findViewById(R.id.floating_video_container);
        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoSurface.setZOrderMediaOverlay(true);
        videoContainer.addView(videoSurface);
        mRtcEngine.setupLocalVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, 0));
    }

    private void setupRemoteVideoStream(int uid) {
        // Set up the UI element for the remote stream.
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);
        // Ignore any new streams that join the session.
        if (videoContainer.getChildCount() >= 1) {
            return;
        }
        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoContainer.addView(videoSurface);
        mRtcEngine.setupRemoteVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, uid));
        mRtcEngine.setRemoteSubscribeFallbackOption(io.agora.rtc.Constants.STREAM_FALLBACK_OPTION_AUDIO_ONLY);
    }

    // Join the channel when the user clicks the UI button.
    public void onjoinChannelClicked(View view) {
        mRtcEngine.joinChannel(null, "test-channel", "Extra Optional Data", 0); // if you do not specify the uid, Agora will assign one
        setupLocalVideoFeed();
        findViewById(R.id.joinBtn).setVisibility(View.GONE);     // hide the join button
        findViewById(R.id.bass).setVisibility(View.VISIBLE);     // show the audio buttons
        findViewById(R.id.beautify).setVisibility(View.VISIBLE);
    }

    private void leaveChannel() {
        mRtcEngine.leaveChannel();
    }

    private void removeVideo(int containerID) {
        FrameLayout videoContainer = findViewById(containerID);
        videoContainer.removeAllViews();
    }

    private void onRemoteUserVideoToggle(int uid, int state) {
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);
        SurfaceView videoSurface = (SurfaceView) videoContainer.getChildAt(0);
        videoSurface.setVisibility(state == 0 ? View.GONE : View.VISIBLE);
        // Add an icon to let the other user know remote video has been disabled.
        if (state == 0) {
            ImageView noCamera = new ImageView(this);
            noCamera.setImageResource(R.drawable.video_disabled);
            videoContainer.addView(noCamera);
        } else {
            ImageView noCamera = (ImageView) videoContainer.getChildAt(1);
            if (noCamera != null) {
                videoContainer.removeView(noCamera);
            }
        }
    }

    private void onRemoteUserLeft() {
        removeVideo(R.id.bg_video_container);
    }

    public boolean checkSelfPermission(String permission, int requestCode) {
        Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
        if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, REQUESTED_PERMISSIONS, requestCode);
            return false;
        }
        return true;
    }

    @Override
    public void onRequestPermissionsResult(int requestCode,
                                           @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        Log.i(LOG_TAG, "onRequestPermissionsResult " + grantResults[0] + " " + requestCode);
        switch (requestCode) {
            case PERMISSION_REQ_ID: {
                if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
                    Log.i(LOG_TAG, "Need permissions " + Manifest.permission.RECORD_AUDIO + "/" + Manifest.permission.CAMERA);
                    break;
                }
                initAgoraEngine();
                break;
            }
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        leaveChannel();
        RtcEngine.destroy();
        mRtcEngine = null;
    }

    public void playAudio(View v) {
        mRtcEngine.startAudioMixing(Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3",
                false, false, -1, 0);
        // Adjust the volume of the mixed audio.
        mRtcEngine.adjustAudioMixingVolume(90);
        Toast.makeText(getApplicationContext(),
                "just played the song",
                Toast.LENGTH_LONG).show();
    }

    public void stopAudio(View v) {
        mRtcEngine.stopAudioMixing();
        Toast.makeText(getApplicationContext(),
                "stopped playing music",
                Toast.LENGTH_LONG).show();
    }
}
If you are not familiar with building a one-to-one video call app using the Agora SDK, check out this tutorial on GitHub written by Hermes. The code above uses the same concepts taught in that tutorial.
Summary
In this tutorial, you have learned how to use the Agora SDK to:
- Preload audio
- Play and pause audio
- Adjust the audio volume
Conclusion
Hooray, you now know how to play audio files during a video call using the Agora SDK!
Thank you for reading. You can learn more about playing audio with the Agora SDK here, and you can check out more Agora features on GitHub here. If you want to copy or reference the demo project I was using, check it out on GitHub here.
Other Resources
If you get stuck at any point in the process, check out Agora’s documentation. You can also join Agora’s Slack channel here.