1. First, let's look at a rough diagram of FFmpeg's overall workflow:
PS: The diagram is reproduced from 叶余; special thanks once again.
As the diagram shows, an already-muxed FLV video on the phone is first demuxed, separating the video stream from the audio stream; FFmpeg wraps the compressed data of each stream as packets (AVPacket). The packets then go through the video and audio decoders respectively, producing frames (AVFrame), which you can simply think of as raw YUV and PCM data. At this point the data could be post-processed, for example to add a watermark or alter the voice. Here we do no extra processing: the audio and video can be output directly to the device for playback, or re-encoded and saved locally, or pushed to a remote streaming server.
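To make the packet/frame terminology concrete, here is a minimal C++ sketch of that demux-and-decode loop using the FFmpeg API (video only, error handling trimmed; the input path is just a placeholder, and av_register_all() is only needed on FFmpeg 3.x builds such as the one used later in this article):
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

static void demux_and_decode(const char *url) {            // e.g. a local .flv path (placeholder)
    av_register_all();                                      // FFmpeg 3.x only; removed in newer versions
    AVFormatContext *fmt = nullptr;
    if (avformat_open_input(&fmt, url, nullptr, nullptr) < 0) return;   // open the muxed file
    avformat_find_stream_info(fmt, nullptr);                // probe the streams

    // Locate the video stream and open a decoder for it (audio works the same way).
    int video_index = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
    if (video_index < 0) { avformat_close_input(&fmt); return; }
    AVCodecParameters *par = fmt->streams[video_index]->codecpar;
    const AVCodec *codec = avcodec_find_decoder(par->codec_id);
    AVCodecContext *dec = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(dec, par);
    avcodec_open2(dec, codec, nullptr);

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {                  // demux: compressed AVPacket
        if (pkt->stream_index == video_index) {
            avcodec_send_packet(dec, pkt);                  // decode: packet in ...
            while (avcodec_receive_frame(dec, frame) == 0) {
                // ... raw frame out: YUV data you could render, watermark,
                // or hand to an encoder for saving / pushing to a server.
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&dec);
    avformat_close_input(&fmt);
}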
Next, let me walk through building the streaming pusher step by step!
1. Create an Android C++ project
PS: You need to download the NDK and the matching LLDB debugger in the SDK Manager.
2. Import the FFmpeg .so libraries
Create a libs folder and put the FFmpeg include directory (header files) and the per-ABI shared libraries (.so) into it.
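Concretely, the layout I assume here is the following (the ABI folder names are examples; use whichever ABIs you built FFmpeg for, and they must match the ${ANDROID_ABI} lookup in the CMakeLists.txt below):
app/libs/include/        FFmpeg headers (libavcodec, libavformat, libavutil, ...)
app/libs/armeabi-v7a/    libavcodec.so, libavformat.so, libavutil.so, ...
app/libs/arm64-v8a/      the same .so set built for 64-bit ARM
app/libs/x86/            the same .so set for the emulator (see the PS at the end)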
3. Write the CMakeLists.txt
# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html
# Sets the minimum version of CMake required to build the native library.
cmake_minimum_required(VERSION 3.10.2)
# Declares and names the project.
project("beipush")
# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
MESSAGE("默认路径:"+ ${PROJECT_SOURCE_DIR})
set(cpp_dir ${PROJECT_SOURCE_DIR}/../../../src/main/cpp)
file(GLOB cpp_src ${cpp_dir}/*.cpp ${cpp_dir}/*.h)
MESSAGE("cpp_dir:"+ ${cpp_src})
add_library( # Sets the name of the library.
native-lib
# Sets the library as a shared library.
SHARED
# Provides a relative path to your source file(s). Compile the C++ sources collected above.
${cpp_src})
# Add the FFmpeg header files (.h)
set(LIB_PATH ${PROJECT_SOURCE_DIR}/../../../libs)
MESSAGE("路径==" ${LIB_PATH})
include_directories(${LIB_PATH}/include)
set(DIR ${LIB_PATH}/${ANDROID_ABI})
MESSAGE("路径==" ${LIB_PATH})
MESSAGE("路径==" ${DIR})
# Searches for a specified prebuilt library and stores the path as a
# variable. Because CMake includes system libraries in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.
# Start adding the FFmpeg library dependencies
# Add the codec (encoding/decoding) library
# The syntax is much the same as for the cpp target above
add_library(avcodec
SHARED
IMPORTED)
set_target_properties(
avcodec
PROPERTIES IMPORTED_LOCATION
${DIR}/libavcodec.so
)
# Add the device input/output library
add_library(avdevice
SHARED
IMPORTED)
set_target_properties(
avdevice
PROPERTIES IMPORTED_LOCATION
${DIR}/libavdevice.so
)
# Add the filter/effects library
add_library(avfilter
SHARED
IMPORTED)
set_target_properties(
avfilter
PROPERTIES IMPORTED_LOCATION
${DIR}/libavfilter.so
)
# Add the container (muxing/demuxing) format library
add_library(avformat
SHARED
IMPORTED)
set_target_properties(
avformat
PROPERTIES IMPORTED_LOCATION
${DIR}/libavformat.so
)
# Add the utility library
add_library(avutil
SHARED
IMPORTED)
set_target_properties(
avutil
PROPERTIES IMPORTED_LOCATION
${DIR}/libavutil.so
)
# Add the audio resampling / sample format conversion library
add_library(swresample
SHARED
IMPORTED)
set_target_properties(
swresample
PROPERTIES IMPORTED_LOCATION
${DIR}/libswresample.so
)
# Add the video scaling and pixel format conversion library
add_library(swscale
SHARED
IMPORTED)
set_target_properties(
swscale
PROPERTIES IMPORTED_LOCATION
${DIR}/libswscale.so
)
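# Add the video post-processing library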
add_library(postproc
SHARED
IMPORTED)
set_target_properties(
postproc
PROPERTIES IMPORTED_LOCATION
${DIR}/libpostproc.so
)
find_library( # Sets the name of the path variable.
log-lib
# Specifies the name of the NDK library that
# you want CMake to locate.
log)
# Specifies libraries CMake should link to your target library. You
# can link multiple libraries, such as libraries you define in this
# build script, prebuilt third-party libraries, or system libraries.
# Link the shared libraries into native-lib
target_link_libraries( # Specifies the target library.
native-lib
avcodec
avdevice
avfilter
avformat
avutil
swresample
swscale
postproc
# Links the target library to the log library
# included in the NDK.
android
${log-lib})
That is the CMakeLists.txt, along with the corresponding cpp directory, which you can use as a reference.
Note: normally the cpp folder should contain only the default native-lib.cpp and the CMakeLists.txt; the other files here belong to other features of this project and can be ignored.
4. Configure the app module's build.gradle
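As a rough sketch (not necessarily the post's exact configuration), the relevant pieces of the module-level build.gradle look roughly like this; the ABI list and paths are assumptions and must match your own project layout:
android {
    defaultConfig {
        // ...
        externalNativeBuild {
            cmake {
                cppFlags ""
            }
        }
        ndk {
            // only package the ABIs you actually provide FFmpeg .so files for (assumed list)
            abiFilters "armeabi-v7a", "arm64-v8a", "x86"
        }
    }
    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"   // the script shown above
            version "3.10.2"
        }
    }
    sourceSets {
        main {
            // package the prebuilt FFmpeg .so files from app/libs/<ABI>/;
            // if your AGP version already packages the imported libraries and
            // reports duplicate .so files, drop this block
            jniLibs.srcDirs = ['libs']
        }
    }
}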
At this point the integration is basically complete. So how do we verify it?
5. Hello FFmpeg: print FFmpeg's build information
We can verify it by modifying the stringFromJNI function generated in the default NDK project template.
JNI side:
Here we simply return "hello ffmpeg" plus FFmpeg's detailed build configuration.
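A minimal sketch of that change in native-lib.cpp (the Java package in the exported function name is an assumption based on the project name and must match your own applicationId; avcodec_configuration() returns the configure flags FFmpeg was built with):
#include <jni.h>
#include <string>

extern "C" {
#include <libavcodec/avcodec.h>
}

extern "C" JNIEXPORT jstring JNICALL
Java_com_example_beipush_MainActivity_stringFromJNI(JNIEnv *env, jobject /* this */) {
    // NOTE: "com.example.beipush" is assumed; use your real package name.
    // "hello ffmpeg" plus the full configure line of the FFmpeg build;
    // this is where --enable-libx264 / --enable-libfdk-aac show up if present.
    std::string hello = "hello ffmpeg\n";
    hello += avcodec_configuration();
    return env->NewStringUTF(hello.c_str());
}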
Java side:
The Java code is all generated by default and needs no changes; just click Run.
If all goes well, the emulator should pop up something like this:
As you can see, the FFmpeg build here is version 3.4.8, with libx264 and libfdk-aac enabled.
PS: Don't forget that if you want the emulator to work properly, you need to include the x86 ABI; you can clearly see the "x86" label in the screenshot as well.
With that, we have successfully integrated FFmpeg into our Android NDK project.
Note: feel free to ask me questions in the comments; I'll share everything I know.
That's a wrap...