
Audio extension

The audio filters you create can be integrated into apps to provide voice effects and noise cancellation.

Understand the tech

An audio filter accesses voice data as it is captured from the user's local device, modifies it, then delivers the processed data for local playback and transmission to remote users in the channel.


A typical transmission pipeline consists of a chain of procedures, including capture, pre-processing, encoding, transmitting, decoding, post-processing, and play. To modify the audio data in the transmission pipeline, audio extensions are inserted into either the pre-processing or the post-processing procedure.
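The insertion point matters because each stage hands its output to the next. As a toy illustration (the types and names here are invented for this sketch, not part of the Agora SDK), a filter is just one more stage function spliced into the chain between capture and encoding:

```cpp
#include <cstdint>
#include <functional>
#include <vector>

// A toy pipeline: each stage transforms a buffer of 16-bit PCM samples.
using Frame = std::vector<int16_t>;
using Stage = std::function<Frame(const Frame&)>;

// Run a captured frame through every stage in order
// (e.g. pre-processing filters, then encoding, and so on).
Frame runPipeline(Frame frame, const std::vector<Stage>& stages) {
    for (const auto& stage : stages) {
        frame = stage(frame);
    }
    return frame;
}

// An example pre-processing stage: halve the volume of every sample.
Frame halveVolume(const Frame& in) {
    Frame out;
    out.reserve(in.size());
    for (int16_t s : in) {
        out.push_back(static_cast<int16_t>(s / 2));
    }
    return out;
}
```

Placing your filter earlier or later in the stages list corresponds to pre-processing or post-processing in the real pipeline.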

Prerequisites

In order to follow this procedure you must have:

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • A project to develop in.

Project setup

To integrate an extension into your project:

  1. Unzip Video SDK to a local directory.
  2. Copy the header files in rtc/sdk/low_level_api/include to your project directory.

You are now ready to develop your extension.

Create an audio extension

To create an extension in your Agora project, use the following APIs:

  • IAudioFilter: Implements receiving, processing, and delivering audio data.
  • IExtensionProvider: Encapsulates your IAudioFilter implementation into an extension.

Develop an audio filter

Use the IAudioFilter interface to implement an audio filter. You can find the interface in the NGIAgoraMediaNode.h file. You must implement at least the following methods from this interface:

  • adaptAudioFrame: Processes the input audio frames and returns the adapted frames.
  • setEnabled: Enables or disables the audio filter.
  • isEnabled: Checks whether the audio filter is enabled.
  • setProperty: Sets the property of the audio filter.
  • getProperty: Gets the property of the audio filter.
  • getName: Retrieves the vendor name.

The following code sample shows how to use these methods together to implement an audio filter extension:


// After receiving the audio frames to be processed, call adaptAudioFrame to process the audio frames.
bool ExtensionAudioFilter::adaptAudioFrame(const media::base::AudioPcmFrame &inAudioPcmFrame,
                                           media::base::AudioPcmFrame &adaptedPcmFrame) {
    return audioProcessor_->processFrame(inAudioPcmFrame, adaptedPcmFrame) == 0;
}

// Call setProperty to set the property of the audio filter.
int ExtensionAudioFilter::setProperty(const char* key, const void* buf, int buf_size) {
    std::string str_volume = "100";
    if (std::string(key) == "volume") {
        str_volume = std::string(static_cast<const char*>(buf), buf_size);
    }

    int int_volume_ = atoi(str_volume.c_str());
    audioProcessor_->setVolume(int_volume_);
    return ERR_OK;
}

// Call getProperty to get the property of the audio filter.
int ExtensionAudioFilter::getProperty(const char* key, void* buf, int buf_size) const override { return ERR_OK; }

// Call setEnabled to enable the audio filter.
void ExtensionAudioFilter::setEnabled(bool enable) override { enabled_ = enable; }

// Call isEnabled to check whether the audio filter is enabled.
bool ExtensionAudioFilter::isEnabled() const override { return enabled_; }

// Set the vendor name in the return value of getName.
const char* getName() const override { return filterName_.c_str(); }

// (Optional) Specify the audio sample rate in the return value of getPreferredSampleRate.
int ExtensionAudioFilter::getPreferredSampleRate() override { return 48000; }

// (Optional) Specify the number of audio channels in the return value of getPreferredChannelNumbers.
int ExtensionAudioFilter::getPreferredChannelNumbers() override { return 2; }

Encapsulate the filter into an extension

To encapsulate the audio filter into an extension, you need to implement the IExtensionProvider interface. You can find the interface in the NGIAgoraExtensionProvider.h file. You must implement at least the following methods from this interface:

  • enumerateExtensions: Enumerates the extensions that can be encapsulated.
  • createAudioFilter: Creates an audio filter instance.
  • setExtensionControl: Sets the extension control.

The following code sample shows how to use these methods to encapsulate the audio filter:


void ExtensionProvider::enumerateExtensions(ExtensionMetaInfo* extension_list,
                                            int& extension_count) {
    extension_count = 2;
    // Declare a video filter, and IExtensionProvider::createVideoFilter will be called
    ExtensionMetaInfo i;
    i.type = EXTENSION_TYPE::VIDEO_PRE_PROCESSING_FILTER;
    i.extension_name = agora::extension::VIDEO_FILTER_NAME;
    extension_list[0] = i;

    // Declare an audio filter, and IExtensionProvider::createAudioFilter will be called
    ExtensionMetaInfo j;
    j.type = EXTENSION_TYPE::AUDIO_FILTER;
    j.extension_name = agora::extension::AUDIO_FILTER_NAME;
    extension_list[1] = j;
}

agora_refptr<agora::rtc::IAudioFilter> ExtensionProvider::createAudioFilter(const char* name) {
    PRINTF_INFO("ExtensionProvider::createAudioFilter %s", name);
    auto audioFilter = new agora::RefCountedObject<agora::extension::ExtensionAudioFilter>(name, audioProcessor_);
    return audioFilter;
}

void ExtensionProvider::setExtensionControl(rtc::IExtensionControl* control) {
    audioProcessor_->setExtensionControl(control);
}

Package the extension

After encapsulating the filter into an extension, you need to register and package it into a .aar or .so file, and submit it, together with a file that contains the extension name, vendor name, and filter name, to Agora.

  1. Register the extension

    Register the extension with the macro REGISTER_AGORA_EXTENSION_PROVIDER, which is in the AgoraExtensionProviderEntry.h file. Use this macro at the entry point of your extension implementation. When the SDK loads the extension, this macro automatically registers it to the SDK. For example:


    REGISTER_AGORA_EXTENSION_PROVIDER(ByteDance, agora::extension::ExtensionProvider);

  2. Link the libagora-rtc-sdk-jni.so file

    In CMakeLists.txt, specify the path to the libagora-rtc-sdk-jni.so file from the downloaded SDK package according to the following table:

    File                              Path
    64-bit libagora-rtc-sdk-jni.so    AgoraWithByteDanceAndroid/agora-bytedance/src/main/agoraLibs/arm64-v8a
    32-bit libagora-rtc-sdk-jni.so    AgoraWithByteDanceAndroid/agora-bytedance/src/main/agoraLibs/armeabi-v7a
  3. Provide extension information

    Create a .java or .md file to provide the following information:

    • EXTENSION_NAME: The name of the target link library used in CMakeLists.txt. For example, for a .so file named libagora-bytedance.so, the EXTENSION_NAME should be agora-bytedance.
    • EXTENSION_VENDOR_NAME: The name of the extension provider, which is used for registering in the agora-bytedance.cpp file.
    • EXTENSION_FILTER_NAME: The name of the filter, which is defined in ExtensionProvider.h.

Test your implementation

Once you have developed your extension and API endpoints, test whether they work properly:

  • Functional and performance tests

    Test the functionality and performance of your extension and submit a test report to Agora. This report must contain:

    • The following proof of functionality:
      • The extension is enabled and loaded in the SDK normally.
      • All key-value pairs in the setExtensionProperty or setExtensionPropertyWithVendor method work properly.
      • All event callbacks of your extension work properly through IMediaExtensionObserver.
    • The following performance data:
      • The average time the extension needs to process an audio or video frame.
      • The maximum amount of memory required by the extension.
      • The maximum amount of CPU/GPU consumption required by the extension.
  • Extension listing test

    The Extensions Marketplace is where developers discover your extension. In the Marketplace, each extension has a product listing that provides detailed information, such as a feature overview and implementation guides. Before making your extension listing publicly accessible, the best practice is to see how everything looks and try every function in a test environment.

  • Write the integration document for your extension

    The easier it is for other developers to integrate your extension, the more it will be used. Follow the guidelines to create the best integration guide for your extension.

  • Apply for testing

    To apply for access to the test environment, contact Agora and provide the following:

    • Your extension package
    • Extension listing assets, including:
      • Your company name
      • Your public email address
      • The Provisioning API endpoints
      • The Usage and Billing API endpoints
      • Your draft business model or pricing plan
      • Your support page URL
      • Your official website URL
      • Your implementation guides URL
  • Test your extension listing

    Once your application is approved, Agora publishes your extension in the test environment and sends you an e-mail.

    To test if everything works properly with your extension in the Marketplace, do the following:

    • Activate and deactivate your extension in an Agora project, and see whether the Provisioning APIs work properly.
    • Follow your implementation guides to implement your extension in an Agora project, and see whether you need to update your documentation.
    • At the end of the month, check the billing information to see whether the Usage and Billing APIs work properly.

Now you are ready to submit your extension for final review by Agora. You can now Publish Your Extension.

Reference

This section contains content that completes the information on this page, or points you to documentation that explains other aspects of this product.

Sample project

Agora provides an Android sample project agora-simple-filter for developing audio and video filter extensions.

API reference

The classes used to create and encapsulate filters are:

  • IAudioFilter: Implements receiving, processing, and delivering audio data.
  • IExtensionProvider: Encapsulates your IAudioFilter implementation into an extension.

IAudioFilter

Implement receiving, processing, and delivering audio data.

Methods include:

adaptAudioFrame

Adapts the audio frame. This is the core method of the IAudioFilter interface. The SDK calls this method to process audio frames from inAudioFrame and return the adapted frames in adaptedFrame. This method supports audio data in the PCM format only.


virtual bool adaptAudioFrame(const media::base::AudioPcmFrame& inAudioFrame,
                             media::base::AudioPcmFrame& adaptedFrame) = 0;

Parameter       Description
inAudioFrame    An input parameter. The pointer to the audio frames to be processed.
adaptedFrame    An output parameter. The pointer to the processed audio frames.

setEnabled

Enables or disables the audio filter.


virtual void setEnabled(bool enable) {}

Parameter   Description
enable      Whether to enable the audio filter:
            • true: Enable the audio filter.
            • false: (Default) Disable the audio filter.

isEnabled

Checks whether the audio filter is enabled.

virtual bool isEnabled() { return true; }

Returns

Whether the audio filter is enabled:
  • true: The audio filter is enabled.
  • false: (Default) The audio filter is disabled.

setProperty

Sets the property of the audio filter. When an app client calls setExtensionProperty, the SDK triggers this callback. In the callback, you need to apply the property to the audio filter.

virtual int setProperty(const char* key, const void* buf, int buf_size)

Parameter   Description
key         The key of the property.
buf         The buffer of the property in the JSON format. You can use the open-source nlohmann/json library for serialization and deserialization between C++ structs and JSON strings.
buf_size    The size of the buffer.

getProperty

Gets the property of the audio filter. When the app client calls getExtensionProperty, the SDK calls this method to get the property of the audio filter.

virtual int getProperty(const char* key, void* buf, int buf_size) const

Parameter   Description
key         The key of the property.
buf         The pointer to the buffer that receives the property.
buf_size    The size of the buffer.
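Because the property buffer arrives as JSON text, the filter has to extract values from it. Below is a minimal sketch using only the standard library (in practice you would likely use a JSON library such as nlohmann/json, as noted above); the function name and the flat-object assumption are this sketch's own:

```cpp
#include <cstdlib>
#include <string>

// Extract an integer value for `key` from a flat JSON object such as
// {"volume":80}. Illustrative only: no nesting, string values, or
// whitespace handling.
int parseIntProperty(const std::string& json, const std::string& key, int fallback) {
    const std::string quoted = "\"" + key + "\"";
    std::size_t pos = json.find(quoted);
    if (pos == std::string::npos) return fallback;
    pos = json.find(':', pos + quoted.size());
    if (pos == std::string::npos) return fallback;
    return std::atoi(json.c_str() + pos + 1);
}
```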

getName

Retrieves the vendor name. You need to set the VENDOR_NAME in the return value of this method.

virtual const char* getName() const = 0;

getPreferredSampleRate

Retrieves the preferred sample rate of the audio filter.

This method is optional. If you specify a sample rate in the return value of this method, the SDK resamples the audio data accordingly before sending it to the audio filter.

virtual int getPreferredSampleRate() { return 0; }

getPreferredChannelNumbers

Retrieves the preferred number of channels of the audio filter.

This method is optional. If you specify a number in the return value of this method, the SDK resamples the audio data accordingly before sending it to the audio filter.

virtual int getPreferredChannelNumbers() { return 0; }

IExtensionProvider

Encapsulates your IAudioFilter implementation into an extension.

Methods include:

enumerateExtensions

Enumerates your extensions that can be encapsulated. The SDK triggers this callback when loading the extension. In the callback, you need to return information about all of your extensions that can be encapsulated.

virtual void enumerateExtensions(ExtensionMetaInfo* extension_list,
                                 int& extension_count) {
    (void) extension_list;
    extension_count = 0;
}

Parameter          Description
extension_list     Extension information, including the extension type and name. For details, see the definition of ExtensionMetaInfo.
extension_count    The total number of extensions that can be encapsulated.

The definition of ExtensionMetaInfo is as follows:


    _21
    // EXTENSION_TYPE represents where the extension is located in the media transmission pipeline
    _21
    enum EXTENSION_TYPE {
    _21
    // Audio processing filter
    _21
    AUDIO_FILTER,
    _21
    // Video preprocessing filter
    _21
    VIDEO_PRE_PROCESSING_FILTER,
    _21
    // Video postprocessing filter
    _21
    VIDEO_POST_PROCESSING_FILTER,
    _21
    // Reserved for future use
    _21
    AUDIO_SINK,
    _21
    // Reserved for future use
    _21
    VIDEO_SINK,
    _21
    // Reserved for future use
    _21
    UNKNOWN,
    _21
    };
    _21
    _21
    // Extension information, including extension type and name
    _21
    struct ExtensionMetaInfo {
    _21
    EXTENSION_TYPE type;
    _21
    const char* extension_name;
    _21
    };
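For an audio-only extension, enumerateExtensions can report a single entry. The sketch below restates just enough of the definitions above to stand alone; the extension name is hypothetical:

```cpp
// Minimal stand-ins for the SDK definitions shown above; in a real
// extension these come from the Agora headers.
enum EXTENSION_TYPE {
    AUDIO_FILTER,
    VIDEO_PRE_PROCESSING_FILTER,
};

struct ExtensionMetaInfo {
    EXTENSION_TYPE type;
    const char* extension_name;
};

// Report a single audio filter to the SDK.
void enumerateExtensions(ExtensionMetaInfo* extension_list, int& extension_count) {
    extension_count = 1;
    ExtensionMetaInfo info;
    info.type = AUDIO_FILTER;
    info.extension_name = "my-audio-filter";  // hypothetical filter name
    extension_list[0] = info;
}
```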

If you specify AUDIO_FILTER as the EXTENSION_TYPE, then after the customer creates the IExtensionProvider object when initializing RtcEngine, the SDK calls the createAudioFilter method, and you need to return an IAudioFilter instance in that method.

createAudioFilter

Creates an audio filter. You need to return the IAudioFilter instance in this method.

virtual agora_refptr<IAudioFilter> createAudioFilter(const char* name)

After creating an audio filter object, the extension processes the input audio frames with methods in IAudioFilter.

setExtensionControl

Sets the extension control.

virtual void setExtensionControl(IExtensionControl* control)

After calling this method, you need to maintain the IExtensionControl object passed in this method. The IExtensionControl object manages the interaction between the extension and the SDK by triggering callbacks and sending logs. For example, if you have called fireEvent in IExtensionControl:

void ByteDanceProcessor::dataCallback(const char* data) {
    if (control_ != nullptr) {
        control_->fireEvent(id_, "beauty", data);
    }
}

And if the app registers the IMediaExtensionObserver class when initializing RtcEngine, the SDK triggers the following callback on the app client:

@Override
public void onEvent(String vendor, String key, String value) {
    ...
}
