Merge branch 'master' into dev

This commit is contained in:
xiongguangjie 2024-06-13 14:56:11 +08:00
commit 6153dc43e8
110 changed files with 1011 additions and 2133 deletions


@@ -20,14 +20,14 @@ assignees: ''
* https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax
-->
### Symptom description
## Symptom description
<!--
Which feature were you using when the problem occurred? What is the abnormal behavior?
e.g. while testing the WebRTC feature, a stream pushed by FFmpeg over RTSP stutters or shows artifacts when played in Chrome via the ZLMediaKit built-in web page.
-->
### How to reproduce?
## How to reproduce?
<!--
Clear reproduction steps are extremely helpful for solving the problem quickly.
@@ -37,7 +37,7 @@ assignees: ''
1. Expected ..., actual ...
-->
### Related logs or screenshots
## Related logs or screenshots
<!--
Since logs are usually long, it is recommended to fill them into the "log content..." block below
@@ -50,11 +50,14 @@ assignees: ''
<details>
<summary>Expand to view the full log</summary>
<pre>
Log content...
```
#Paste the full log here!
```
</pre>
</details>
### Configuration
## Configuration
<!--
Some common problems are caused by misconfiguration; it is recommended to read the comments in the config file carefully
@@ -65,11 +68,14 @@ assignees: ''
<details>
<summary>Expand to view the full configuration</summary>
<pre>
Configuration content...
```ini
#Paste the contents of config.ini here!
```
</pre>
</details>
### Environment information
## Environment information
<!--
Please fill in the environment details; detailed environment information helps reproduce and locate the problem quickly.
@@ -82,4 +88,8 @@ assignees: ''
* **Code revision / git commit hash**:
* **OS and version**:
* **Hardware info**:
* **crash backtrace**:
```
#Paste the crash backtrace here
```
* **Any other information**:


@@ -20,7 +20,7 @@ assignees: ''
* https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax
-->
### Related logs and environment information
## Related logs and environment information
<!--
Since build logs are usually long, it is recommended to fill them into the `````` block below, or upload the log file
@@ -41,7 +41,7 @@ assignees: ''
Please upload the `CMakeCache.txt` file from your build directory directly as an attachment.
### Environment information
## Environment information
<!--
Please fill in the environment details; detailed environment information helps reproduce and locate the problem quickly.

.github/ISSUE_TEMPLATE/config.yml (new file, 6 lines)

@@ -0,0 +1,6 @@
blank_issues_enabled: false
contact_links:
- name: Technical consulting
url: https://t.zsxq.com/FcVK5
about: Please start technical consultations in the 知识星球 (Zsxq) group


@@ -7,8 +7,8 @@ assignees: ''
---
**Describe what this feature is for; you may provide related material describing it**
## Describe what this feature is for; you may provide related material describing it
**Is this feature intended to fix a project defect? If so, describe the existing defect**
## Is this feature intended to fix a project defect? If so, describe the existing defect
**Describe how you expect this feature to be implemented and the desired end result**
## Describe how you expect this feature to be implemented and the desired end result


@@ -1,19 +0,0 @@
---
name: Technical consulting
about: Usage questions, technical consulting, etc.
title: "[Consulting] Description of your question (required)"
labels: 技术咨询
assignees: ''
---
**Feature module you are asking about**
- Please describe which part of ZLMediaKit you want to ask about
**Specific content of your question**
- Elaborate on your question here
**Notes**
- Before asking, please read the readme and the [wiki](https://github.com/ZLMediaKit/ZLMediaKit/wiki) carefully; if necessary, also search the answered issues, and only open an issue here if you cannot find an answer
- Technical consulting is not a bug report; users are asked to star this project first, otherwise the issue will be closed directly


@@ -1,8 +0,0 @@
---
name: Issue creation requirements
about: Issues that do not follow the template make problems hard to locate and may be closed directly by the maintainers
title: ""
labels: ''
assignees: ''
---


@@ -23,3 +23,35 @@ jobs:
- name: Build
run: cd Android && ./gradlew build
- name: Set environment variables
run: |
echo "BRANCH=$(echo ${GITHUB_REF#refs/heads/} | tr -s "/\?%*:|\"<>" "_")" >> $GITHUB_ENV
echo "BRANCH2=$(echo ${GITHUB_REF#refs/heads/} )" >> $GITHUB_ENV
echo "DATE=$(date +%Y-%m-%d)" >> $GITHUB_ENV
- name: Package binaries
id: upload
uses: actions/upload-artifact@v4
with:
name: ${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}
path: Android/app/build/outputs/apk/debug/*
if-no-files-found: error
retention-days: 90
- name: Comment on issue
if: github.event_name != 'pull_request' && github.ref != 'refs/heads/feature/test'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
github.rest.issues.createComment({
issue_number: 483,
owner: context.repo.owner,
repo: context.repo.repo,
body: '- Download: [${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}](${{ steps.upload.outputs.artifact-url }})\n'
+ '- Branch: ${{ env.BRANCH2 }}\n'
+ '- git hash: ${{ github.sha }}\n'
+ '- Build date: ${{ env.DATE }}\n'
+ '- CI workflow: ${{ github.workflow }}\n'
})
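Several workflows in this commit derive `BRANCH`, `BRANCH2`, and `DATE` from `GITHUB_REF`: the `refs/heads/` prefix is stripped with parameter expansion, and characters that are unsafe in artifact file names are collapsed to underscores with `tr -s`. A standalone sketch of the same transformation (the ref value is hypothetical, and the `tr` set is written with single quotes here to avoid the double-quote escaping used in the workflow):

```shell
#!/bin/sh
# Hypothetical ref, as GitHub Actions would provide it for a branch build.
GITHUB_REF="refs/heads/feature/test?x"

# BRANCH: prefix stripped, unsafe characters collapsed to "_" (used in artifact names).
BRANCH=$(printf '%s' "${GITHUB_REF#refs/heads/}" | tr -s '/?%*:|"<>' '_')
# BRANCH2: prefix stripped only (used for display in the issue comment).
BRANCH2=${GITHUB_REF#refs/heads/}
DATE=$(date +%Y-%m-%d)

echo "BRANCH=$BRANCH"
echo "BRANCH2=$BRANCH2"
```

In a real job the three values would be appended to `$GITHUB_ENV` instead of echoed, so later steps can interpolate them as `${{ env.BRANCH }}` and friends.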


@@ -7,12 +7,6 @@ on:
- "feature/*"
- "release/*"
pull_request:
branches:
- "master"
- "feature/*"
- "release/*"
env:
# Use docker.io for Docker Hub if empty
REGISTRY: docker.io
@@ -21,7 +15,7 @@ env:
jobs:
build:
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
permissions:
contents: read
packages: write
@@ -39,7 +33,6 @@ jobs:
# Install the cosign tool except on PR
# https://github.com/sigstore/cosign-installer
- name: Install cosign
if: github.event_name != 'pull_request'
uses: sigstore/cosign-installer@d572c9c13673d2e0a26fabf90b5748f36886883f
- name: Set up QEMU
@@ -53,7 +46,6 @@ jobs:
# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
if: github.event_name != 'pull_request'
uses: docker/login-action@28218f9b04b4f3f62068d7b6ce6ca5b26e35336c
with:
registry: ${{ env.REGISTRY }}
@@ -71,6 +63,7 @@ jobs:
# Build and push Docker image with Buildx (don't push on PR)
# https://github.com/docker/build-push-action
- name: Build and push Docker image
if: github.event_name != 'pull_request' && github.ref != 'refs/heads/feature/test'
id: build-and-push
uses: docker/build-push-action@ac9327eae2b366085ac7f6a2d02df8aa8ead720a
with:

.github/workflows/issue_lint.yml (new file, 51 lines)

@@ -0,0 +1,51 @@
name: issue_lint
on:
issues:
types: [opened]
jobs:
issue_lint:
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v3
- uses: actions/github-script@v6
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const fs = require('fs').promises;
const getTitles = (str) => (
[...str.matchAll(/^## (.*)/gm)].map((m) => m[0])
);
const titles = getTitles(context.payload.issue.body);
for (let file of await fs.readdir('.github/ISSUE_TEMPLATE')) {
if (!file.endsWith('.md')) {
continue;
}
const template = await fs.readFile(`.github/ISSUE_TEMPLATE/${file}`, 'utf-8');
const templateTitles = getTitles(template);
if (templateTitles.every((title) => titles.includes(title))) {
process.exit(0);
}
}
await github.rest.issues.createComment({
owner: context.issue.owner,
repo: context.issue.repo,
issue_number: context.issue.number,
body: 'This issue was automatically closed because it does not follow the issue template. Please re-submit it following the template, making sure it contains all section headings from the template.\n',
});
await github.rest.issues.update({
owner: context.issue.owner,
repo: context.issue.repo,
issue_number: context.issue.number,
state: 'closed',
});
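The github-script step above extracts the `## ` headings from each issue template and closes the issue unless some template's headings all appear in the issue body. Roughly the same check can be expressed with standard shell tools; this sketch uses made-up file names and English stand-in headings:

```shell
#!/bin/sh
# Write a hypothetical template and issue body to temporary files.
printf '## Describe the bug\n## How to reproduce\n' > template.md
printf 'intro text\n## Describe the bug\ndetails\n## How to reproduce\nsteps\n' > issue_body.txt

# Collect the template's section headings.
grep '^## ' template.md > headings.txt

# The issue passes only if every template heading appears as a full line in the body.
missing=0
while IFS= read -r heading; do
  grep -qxF "$heading" issue_body.txt || missing=1
done < headings.txt

echo "missing=$missing"
```

`grep -qxF` matches the heading as a fixed string against a whole line, which mirrors the `^## (.*)` regex plus exact `includes()` comparison in the JavaScript version.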


@@ -13,9 +13,6 @@ jobs:
- name: Fetch submodule source
run: mv -f .gitmodules_github .gitmodules && git submodule sync && git submodule update --init
- name: Install dependencies via apt-get (optional)
run: sudo apt-get update && sudo apt-get install -y cmake libssl-dev libsdl-dev libavcodec-dev libavutil-dev libswscale-dev libresample-dev libusrsctp-dev
- name: Fetch SRTP
uses: actions/checkout@v2
with:
@@ -24,13 +21,64 @@ jobs:
ref: v2.3.0
path: 3rdpart/libsrtp
- name: Build SRTP
run: cd 3rdpart/libsrtp && ./configure --enable-openssl && make -j4 && sudo make install
- name: Start a Docker container and run the build script inside it
run: |
docker pull centos:7
docker run -v $(pwd):/root -w /root --rm centos:7 sh -c "
set -x
yum install -y git wget gcc gcc-c++ make unzip ffmpeg-devel libavutil-devel libswscale-devel libresample-devel usrsctp-devel
- name: Build
run: mkdir -p linux_build && cd linux_build && cmake .. -DENABLE_WEBRTC=true -DENABLE_FFMPEG=true && make -j $(nproc)
wget https://github.com/openssl/openssl/archive/refs/heads/OpenSSL_1_1_1-stable.zip
unzip OpenSSL_1_1_1-stable.zip
cd openssl-OpenSSL_1_1_1-stable
./config no-shared --prefix=/root/release
make -j $(nproc)
make install
cd ..
- name: Run MediaServer
run: pwd && cd release/linux/Debug && sudo ./MediaServer -d &
wget https://github.com/Kitware/CMake/releases/download/v3.29.5/cmake-3.29.5.tar.gz
tar -xvf cmake-3.29.5.tar.gz
cd cmake-3.29.5
OPENSSL_ROOT_DIR=/root/release ./configure
make -j $(nproc)
make install
cd ..
cd 3rdpart/libsrtp && ./configure --enable-openssl --with-openssl-dir=/root/release && make -j $(nproc) && make install
cd ../../
mkdir -p linux_build && cd linux_build && cmake .. -DOPENSSL_ROOT_DIR=/root/release -DCMAKE_BUILD_TYPE=Release -DENABLE_FFMPEG=true && make -j $(nproc)
"
- name: Set environment variables
run: |
echo "BRANCH=$(echo ${GITHUB_REF#refs/heads/} | tr -s "/\?%*:|\"<>" "_")" >> $GITHUB_ENV
echo "BRANCH2=$(echo ${GITHUB_REF#refs/heads/} )" >> $GITHUB_ENV
echo "DATE=$(date +%Y-%m-%d)" >> $GITHUB_ENV
- name: Package binaries
id: upload
uses: actions/upload-artifact@v4
with:
name: ${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}
path: release/*
if-no-files-found: error
retention-days: 90
- name: Comment on issue
if: github.event_name != 'pull_request' && github.ref != 'refs/heads/feature/test'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
github.rest.issues.createComment({
issue_number: 483,
owner: context.repo.owner,
repo: context.repo.repo,
body: '- Download: [${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}](${{ steps.upload.outputs.artifact-url }})\n'
+ '- Branch: ${{ env.BRANCH2 }}\n'
+ '- git hash: ${{ github.sha }}\n'
+ '- Build date: ${{ env.DATE }}\n'
+ '- CI workflow: ${{ github.workflow }}\n'
+ '- Note: this binary was built on CentOS 7 (x64); make sure your system is not older than that, and install the dependencies first with `sudo yum check-update && sudo yum install -y openssl-devel ffmpeg-devel libavutil-devel libswscale-devel libresample-devel usrsctp-devel`\n'
})


@@ -13,27 +13,52 @@ jobs:
- name: Fetch submodule source
run: mv -f .gitmodules_github .gitmodules && git submodule sync && git submodule update --init
# - name: Install brew
# run: ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
#
# - name: Install dependencies via brew (optional)
# run: brew update && brew install cmake openssl sdl2 ffmpeg
# - name: Fetch SRTP
# uses: actions/checkout@v2
# with:
# repository: cisco/libsrtp
# fetch-depth: 1
# ref: v2.3.0
# path: 3rdpart/libsrtp
#
# - name: Build SRTP
# run: cd 3rdpart/libsrtp && ./configure --enable-openssl && make -j4 && sudo make install
- name: Set up vcpkg
uses: lukka/run-vcpkg@v7
with:
vcpkgDirectory: '${{github.workspace}}/vcpkg'
vcpkgTriplet: arm64-osx
# 2024.06.01
vcpkgGitCommitId: '47364fbc300756f64f7876b549d9422d5f3ec0d3'
vcpkgArguments: 'openssl libsrtp[openssl]'
- name: Build
run: mkdir -p build && cd build && cmake .. && make -j $(nproc)
uses: lukka/run-cmake@v3
with:
useVcpkgToolchainFile: true
buildDirectory: '${{github.workspace}}/build'
cmakeAppendedArgs: ''
cmakeBuildType: 'Release'
- name: Run MediaServer
run: pwd && cd release/darwin/Debug && sudo ./MediaServer -d &
- name: Set environment variables
run: |
echo "BRANCH=$(echo ${GITHUB_REF#refs/heads/} | tr -s "/\?%*:|\"<>" "_")" >> $GITHUB_ENV
echo "BRANCH2=$(echo ${GITHUB_REF#refs/heads/} )" >> $GITHUB_ENV
echo "DATE=$(date +%Y-%m-%d)" >> $GITHUB_ENV
- name: Package binaries
id: upload
uses: actions/upload-artifact@v4
with:
name: ${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}
path: release/*
if-no-files-found: error
retention-days: 90
- name: Comment on issue
if: github.event_name != 'pull_request' && github.ref != 'refs/heads/feature/test'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
github.rest.issues.createComment({
issue_number: 483,
owner: context.repo.owner,
repo: context.repo.repo,
body: '- Download: [${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}](${{ steps.upload.outputs.artifact-url }})\n'
+ '- Branch: ${{ env.BRANCH2 }}\n'
+ '- git hash: ${{ github.sha }}\n'
+ '- Build date: ${{ env.DATE }}\n'
+ '- CI workflow: ${{ github.workflow }}\n'
+ '- Note: this binary is the arm64 build\n'
})


@@ -4,7 +4,7 @@ on: [pull_request]
jobs:
check:
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
with:


@@ -17,14 +17,50 @@ jobs:
with:
vcpkgDirectory: '${{github.workspace}}/vcpkg'
vcpkgTriplet: x64-windows-static
# 2021.05.12
vcpkgGitCommitId: '5568f110b509a9fd90711978a7cb76bae75bb092'
vcpkgArguments: 'openssl libsrtp'
# 2024.06.01
vcpkgGitCommitId: '47364fbc300756f64f7876b549d9422d5f3ec0d3'
vcpkgArguments: 'openssl libsrtp[openssl]'
- name: Build
uses: lukka/run-cmake@v3
with:
useVcpkgToolchainFile: true
buildDirectory: '${{github.workspace}}/build'
cmakeAppendedArgs: '-DCMAKE_ENABLE_WEBRTC:BOOL=ON'
cmakeBuildType: 'RelWithDebInfo'
cmakeAppendedArgs: ''
cmakeBuildType: 'Release'
- name: Set environment variables
run: |
$dateString = Get-Date -Format "yyyy-MM-dd"
$branch = $env:GITHUB_REF -replace "refs/heads/", "" -replace "[\\/\\\?\%\*:\|\x22<>]", "_"
$branch2 = $env:GITHUB_REF -replace "refs/heads/", ""
echo "BRANCH=$branch" >> $env:GITHUB_ENV
echo "BRANCH2=$branch2" >> $env:GITHUB_ENV
echo "DATE=$dateString" >> $env:GITHUB_ENV
- name: Package binaries
id: upload
uses: actions/upload-artifact@v4
with:
name: ${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}
path: release/*
if-no-files-found: error
retention-days: 90
- name: Comment on issue
if: github.event_name != 'pull_request' && github.ref != 'refs/heads/feature/test'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
github.rest.issues.createComment({
issue_number: 483,
owner: context.repo.owner,
repo: context.repo.repo,
body: '- Download: [${{ github.workflow }}_${{ env.BRANCH }}_${{ env.DATE }}](${{ steps.upload.outputs.artifact-url }})\n'
+ '- Branch: ${{ env.BRANCH2 }}\n'
+ '- git hash: ${{ github.sha }}\n'
+ '- Build date: ${{ env.DATE }}\n'
+ '- CI workflow: ${{ github.workflow }}\n'
+ '- Note: this binary is the x64 build\n'
})

@@ -1 +1 @@
Subproject commit 26d54bbc7b1860a450434dce49bbc8fcbcbae88b
Subproject commit 5144e2aa521df6d473308bfb31172054772a634f


@@ -13,7 +13,6 @@
#include <stdio.h>
#ifndef NDEBUG
#ifdef assert
#undef assert
#endif//assert
@@ -27,8 +26,5 @@
#endif
#define assert(exp) Assert_Throw(!(exp), #exp, __FUNCTION__, __FILE__, __LINE__, NULL)
#else
#define assert(e) ((void)0)
#endif//NDEBUG
#endif //ZLMEDIAKIT_ASSERT_H


@@ -403,6 +403,8 @@ if(OPENSSL_FOUND AND ENABLE_OPENSSL)
update_cached_list(MK_LINK_LIBRARIES ${OPENSSL_LIBRARIES})
if(CMAKE_SYSTEM_NAME MATCHES "Linux" AND OPENSSL_USE_STATIC_LIBS)
update_cached_list(MK_LINK_LIBRARIES ${CMAKE_DL_LIBS})
elseif(CMAKE_SYSTEM_NAME MATCHES "Windows" AND OPENSSL_USE_STATIC_LIBS)
update_cached_list(MK_LINK_LIBRARIES Crypt32)
endif()
else()
set(ENABLE_OPENSSL OFF)


@@ -159,12 +159,15 @@
- 2. To use it as a standalone media server without doing C/C++ development, see the [restful api](https://github.com/ZLMediaKit/ZLMediaKit/wiki/MediaServer支持的HTTP-API) and [web hook](https://github.com/ZLMediaKit/ZLMediaKit/wiki/MediaServer支持的HTTP-HOOK-API).
- 3. If you want to do C/C++ development and add business logic or features, see the [test programs](https://github.com/ZLMediaKit/ZLMediaKit/tree/master/tests) here.
## Binary downloads
ZLMediaKit uses GitHub Actions CI to automatically build, package, and upload build artifacts; please download the latest SDK libraries and executables from the [issue list](https://github.com/ZLMediaKit/ZLMediaKit/issues/483).
## Docker image
You can pull the pre-built image from Docker Hub and start it:
```bash
# This image is built and pushed automatically by GitHub CI and stays in sync with the code (master branch)
# This image is built and pushed automatically by GitHub Actions CI and stays in sync with the code (master branch)
docker run -id -p 1935:1935 -p 8080:80 -p 8443:443 -p 8554:554 -p 10000:10000 -p 10000:10000/udp -p 8000:8000/udp -p 9000:9000/udp zlmediakit/zlmediakit:master
```
@@ -188,6 +191,7 @@ bash build_docker_images.sh
- [jessibuca](https://github.com/langhuihui/jessibuca) a wasm-based player supporting H.265
- [wsPlayer](https://github.com/v354412101/wsPlayer) a websocket-fmp4 player based on MSE
- [BXC_gb28181Player](https://github.com/any12345com/BXC_gb28181Player) a C++ video stream player supporting the GB28181 national standard protocol
- [RTCPlayer](https://github.com/leo94666/RTCPlayer) an Android-based RTC player
- Web management sites
- [zlm_webassist](https://github.com/1002victor/zlm_webassist) a companion front/back-end separated web management project for this project
@@ -222,7 +226,7 @@ bash build_docker_images.sh
- Follow the WeChat official account for the latest news:
<img src=https://user-images.githubusercontent.com/11495632/232451702-4c50bc72-84d8-4c94-af2b-57290088ba7a.png width=15% />
- You can also voluntarily join the paid 知识星球 (Zsxq) group for consulting and materials
- You can also voluntarily join the paid 知识星球 (Zsxq) group for consulting, materials, and access to the WeChat tech group
<img src= https://user-images.githubusercontent.com/11495632/231946329-aa8517b0-3cf5-49cf-8c75-a93ed58cb9d2.png width=30% />
@@ -363,6 +367,7 @@ bash build_docker_images.sh
[jamesZHANG500](https://github.com/jamesZHANG500)
[weidelong](https://github.com/wdl1697454803)
[小强先生](https://github.com/linshangqiang)
[李之阳](https://github.com/leo94666)
Thanks also to JetBrains for supporting open source; this project is developed and debugged with CLion


@@ -324,6 +324,10 @@ git submodule update --init
});
```
## Binary file download
ZLMediaKit uses GitHub Actions CI to automatically build, package, and upload build artifacts. Please download the latest SDK library files and executables from the [issue list](https://github.com/ZLMediaKit/ZLMediaKit/issues/483).
## Docker Image
You can download the pre-compiled image from Docker Hub and start it:
@@ -369,6 +373,8 @@ bash build_docker_images.sh
- [WebSocket-fmp4 player based on MSE](https://github.com/v354412101/wsPlayer)
- [Domestic webrtc sdk(metaRTC)](https://github.com/metartc/metaRTC)
- [GB28181 player implemented in C++](https://github.com/any12345com/BXC_gb28181Player)
- [Android RTCPlayer](https://github.com/leo94666/RTCPlayer)
## License
@@ -521,6 +527,7 @@ Thanks to all those who have supported this project in various ways, including b
[jamesZHANG500](https://github.com/jamesZHANG500)
[weidelong](https://github.com/wdl1697454803)
[小强先生](https://github.com/linshangqiang)
[李之阳](https://github.com/leo94666)
Also thanks to JetBrains for their support of open source projects; we developed and debugged ZLMediaKit with CLion:


@@ -51,6 +51,9 @@ extern "C" {
// output logs to the callback function (mk_events::on_mk_log)
#define LOG_CALLBACK (1 << 2)
// backward compatibility
#define mk_env_init1 mk_env_init2
// callback for freeing user_data
typedef void(API_CALL *on_user_data_free)(void *user_data);
@@ -104,7 +107,7 @@ API_EXPORT void API_CALL mk_stop_all_server();
* @param ssl ssl certificate content or path, NULL
* @param ssl_pwd NULL
*/
API_EXPORT void API_CALL mk_env_init1(int thread_num,
API_EXPORT void API_CALL mk_env_init2(int thread_num,
int log_level,
int log_mask,
const char *log_file_path,


@@ -132,7 +132,12 @@ typedef struct {
/**
* broadcast after an mp4 segment file is finished
*/
void (API_CALL *on_mk_record_mp4)(const mk_mp4_info mp4);
void (API_CALL *on_mk_record_mp4)(const mk_record_info mp4);
/**
* broadcast after a ts segment file is finished
*/
void (API_CALL *on_mk_record_ts)(const mk_record_info ts);
/**
* shell login authentication


@@ -18,29 +18,42 @@
extern "C" {
#endif
///////////////////////////////////////////MP4Info/////////////////////////////////////////////
// C mapping of the MP4Info object
typedef struct mk_mp4_info_t *mk_mp4_info;
///////////////////////////////////////////RecordInfo/////////////////////////////////////////////
// C mapping of the RecordInfo object
typedef struct mk_record_info_t *mk_record_info;
// GMT standard time, in seconds
API_EXPORT uint64_t API_CALL mk_mp4_info_get_start_time(const mk_mp4_info ctx);
API_EXPORT uint64_t API_CALL mk_record_info_get_start_time(const mk_record_info ctx);
// recording duration, in seconds
API_EXPORT float API_CALL mk_mp4_info_get_time_len(const mk_mp4_info ctx);
API_EXPORT float API_CALL mk_record_info_get_time_len(const mk_record_info ctx);
// file size, in bytes
API_EXPORT size_t API_CALL mk_mp4_info_get_file_size(const mk_mp4_info ctx);
API_EXPORT size_t API_CALL mk_record_info_get_file_size(const mk_record_info ctx);
// file path
API_EXPORT const char* API_CALL mk_mp4_info_get_file_path(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_file_path(const mk_record_info ctx);
// file name
API_EXPORT const char* API_CALL mk_mp4_info_get_file_name(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_file_name(const mk_record_info ctx);
// folder path
API_EXPORT const char* API_CALL mk_mp4_info_get_folder(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_folder(const mk_record_info ctx);
// playback url
API_EXPORT const char* API_CALL mk_mp4_info_get_url(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_url(const mk_record_info ctx);
// virtual host
API_EXPORT const char* API_CALL mk_mp4_info_get_vhost(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_vhost(const mk_record_info ctx);
// application name
API_EXPORT const char* API_CALL mk_mp4_info_get_app(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_app(const mk_record_info ctx);
// stream id
API_EXPORT const char* API_CALL mk_mp4_info_get_stream(const mk_mp4_info ctx);
API_EXPORT const char *API_CALL mk_record_info_get_stream(const mk_record_info ctx);
//// The macros below keep user code source-compatible; the binary ABI is not compatible, so users must recompile and relink /////
#define mk_mp4_info mk_record_info
#define mk_mp4_info_get_start_time mk_record_info_get_start_time
#define mk_mp4_info_get_time_len mk_record_info_get_time_len
#define mk_mp4_info_get_file_size mk_record_info_get_file_size
#define mk_mp4_info_get_file_path mk_record_info_get_file_path
#define mk_mp4_info_get_file_name mk_record_info_get_file_name
#define mk_mp4_info_get_folder mk_record_info_get_folder
#define mk_mp4_info_get_url mk_record_info_get_url
#define mk_mp4_info_get_vhost mk_record_info_get_vhost
#define mk_mp4_info_get_app mk_record_info_get_app
#define mk_mp4_info_get_stream mk_record_info_get_stream
///////////////////////////////////////////Parser/////////////////////////////////////////////
// C mapping of the Parser object


@@ -32,6 +32,7 @@ typedef struct mk_proxy_player_t *mk_proxy_player;
*/
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create(const char *vhost, const char *app, const char *stream, int hls_enabled, int mp4_enabled);
/**
*
* @param vhost __defaultVhost__
@@ -43,6 +44,32 @@ API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create(const char *vhost, co
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create2(const char *vhost, const char *app, const char *stream, mk_ini option);
/**
*
* @param vhost __defaultVhost__
* @param app
* @param stream
* @param rtp_type rtsp transport type: RTP_TCP = 0, RTP_UDP = 1, RTP_MULTICAST = 2
* @param hls_enabled hls
* @param mp4_enabled mp4
* @param retry_count <0
* @return
*/
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create3(const char *vhost, const char *app, const char *stream, int hls_enabled, int mp4_enabled, int retry_count);
/**
*
* @param vhost __defaultVhost__
* @param app
* @param stream
* @param option ProtocolOption相关配置
* @param retry_count <0
* @return
*/
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create4(const char *vhost, const char *app, const char *stream, mk_ini option, int retry_count);
/**
*
* @param ctx
@@ -71,7 +98,9 @@ API_EXPORT void API_CALL mk_proxy_player_play(mk_proxy_player ctx, const char *u
* the mk_proxy_player_release function, MediaSource.close()
* @param user_data set via the mk_proxy_player_set_on_close function
*/
typedef void(API_CALL *on_mk_proxy_player_close)(void *user_data, int err, const char *what, int sys_err);
typedef void(API_CALL *on_mk_proxy_player_cb)(void *user_data, int err, const char *what, int sys_err);
// keep backward compatibility
#define on_mk_proxy_player_close on_mk_proxy_player_cb
/**
* MediaSource.close()
@@ -81,8 +110,17 @@ typedef void(API_CALL *on_mk_proxy_player_close)(void *user_data, int err, const
* @param cb
* @param user_data
*/
API_EXPORT void API_CALL mk_proxy_player_set_on_close(mk_proxy_player ctx, on_mk_proxy_player_close cb, void *user_data);
API_EXPORT void API_CALL mk_proxy_player_set_on_close2(mk_proxy_player ctx, on_mk_proxy_player_close cb, void *user_data, on_user_data_free user_data_free);
API_EXPORT void API_CALL mk_proxy_player_set_on_close(mk_proxy_player ctx, on_mk_proxy_player_cb cb, void *user_data);
API_EXPORT void API_CALL mk_proxy_player_set_on_close2(mk_proxy_player ctx, on_mk_proxy_player_cb cb, void *user_data, on_user_data_free user_data_free);
/**
*
* @param ctx
* @param cb
* @param user_data
* @param user_data_free
*/
API_EXPORT void API_CALL mk_proxy_player_set_on_play_result(mk_proxy_player ctx, on_mk_proxy_player_cb cb, void *user_data, on_user_data_free user_data_free);
/**
*


@@ -128,6 +128,15 @@ API_EXPORT char *API_CALL mk_ini_dump_string(mk_ini ini);
* @param file
*/
API_EXPORT void API_CALL mk_ini_dump_file(mk_ini ini, const char *file);
///////////////////////////////////////////Statistics/////////////////////////////////////////////
typedef void(API_CALL *on_mk_get_statistic_cb)(void *user_data, mk_ini ini);
/**
*
* @param ini
*/
API_EXPORT void API_CALL mk_get_statistic(on_mk_get_statistic_cb cb, void *user_data, on_user_data_free free_cb);
///////////////////////////////////////////Log/////////////////////////////////////////////


@@ -83,7 +83,7 @@ API_EXPORT void API_CALL mk_stop_all_server(){
stopAllTcpServer();
}
API_EXPORT void API_CALL mk_env_init1(int thread_num,
API_EXPORT void API_CALL mk_env_init2(int thread_num,
int log_level,
int log_mask,
const char *log_file_path,


@@ -42,7 +42,13 @@ API_EXPORT void API_CALL mk_events_listen(const mk_events *events){
NoticeCenter::Instance().addListener(&s_tag,Broadcast::kBroadcastRecordMP4,[](BroadcastRecordMP4Args){
if(s_events.on_mk_record_mp4){
s_events.on_mk_record_mp4((mk_mp4_info)&info);
s_events.on_mk_record_mp4((mk_record_info)&info);
}
});
NoticeCenter::Instance().addListener(&s_tag, Broadcast::kBroadcastRecordTs, [](BroadcastRecordTsArgs) {
if (s_events.on_mk_record_ts) {
s_events.on_mk_record_ts((mk_record_info)&info);
}
});


@@ -26,61 +26,61 @@ using namespace toolkit;
using namespace mediakit;
///////////////////////////////////////////RecordInfo/////////////////////////////////////////////
API_EXPORT uint64_t API_CALL mk_mp4_info_get_start_time(const mk_mp4_info ctx){
API_EXPORT uint64_t API_CALL mk_record_info_get_start_time(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->start_time;
}
API_EXPORT float API_CALL mk_mp4_info_get_time_len(const mk_mp4_info ctx){
API_EXPORT float API_CALL mk_record_info_get_time_len(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->time_len;
}
API_EXPORT size_t API_CALL mk_mp4_info_get_file_size(const mk_mp4_info ctx){
API_EXPORT size_t API_CALL mk_record_info_get_file_size(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->file_size;
}
API_EXPORT const char* API_CALL mk_mp4_info_get_file_path(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_file_path(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->file_path.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_file_name(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_file_name(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->file_name.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_folder(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_folder(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->folder.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_url(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_url(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->url.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_vhost(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_vhost(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->vhost.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_app(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_app(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->app.c_str();
}
API_EXPORT const char* API_CALL mk_mp4_info_get_stream(const mk_mp4_info ctx){
API_EXPORT const char *API_CALL mk_record_info_get_stream(const mk_record_info ctx) {
assert(ctx);
RecordInfo *info = (RecordInfo *)ctx;
return info->stream.c_str();
@@ -528,7 +528,7 @@ API_EXPORT void API_CALL mk_auth_invoker_clone_release(const mk_auth_invoker ctx
}
///////////////////////////////////////////WebRtcTransport/////////////////////////////////////////////
API_EXPORT void API_CALL mk_rtc_sendDatachannel(const mk_rtc_transport ctx, uint16_t streamId, uint32_t ppid, const char *msg, size_t len) {
API_EXPORT void API_CALL mk_rtc_send_datachannel(const mk_rtc_transport ctx, uint16_t streamId, uint32_t ppid, const char *msg, size_t len) {
#ifdef ENABLE_WEBRTC
assert(ctx && msg);
WebRtcTransport *transport = (WebRtcTransport *)ctx;


@@ -27,8 +27,9 @@ protected:
private:
bool _h265 = false;
bool _have_decode_frame = false;
onH264 _cb;
size_t _search_pos = 0;
toolkit::BufferLikeString _buffer;
};
void H264Splitter::setOnSplitted(H264Splitter::onH264 cb) {
@@ -42,11 +43,21 @@ H264Splitter::~H264Splitter() {
}
ssize_t H264Splitter::onRecvHeader(const char *data, size_t len) {
_cb(data, len);
auto frame = Factory::getFrameFromPtr(_h265 ? CodecH265 : CodecH264, (char *)data, len, 0, 0);
if (_have_decode_frame && (frame->decodeAble() || frame->configFrame())) {
// the buffer already holds a decodable frame and the next frame is decodable or a config frame, so flush the buffer
_cb(_buffer.data(), _buffer.size());
_buffer.assign(data, len);
_have_decode_frame = frame->decodeAble();
} else {
// keep buffering
_buffer.append(data, len);
_have_decode_frame = _have_decode_frame || frame->decodeAble();
}
return 0;
}
static const char *onSearchPacketTail_l(const char *data, size_t len) {
const char *H264Splitter::onSearchPacketTail(const char *data, size_t len) {
for (size_t i = 2; len > 2 && i < len - 2; ++i) {
// look for the 0x00 00 01 start code
if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
@@ -60,28 +71,6 @@ static const char *onSearchPacketTail_l(const char *data, size_t len) {
return nullptr;
}
const char *H264Splitter::onSearchPacketTail(const char *data, size_t len) {
auto last_frame = data + _search_pos;
auto next_frame = onSearchPacketTail_l(last_frame, len - _search_pos);
if (!next_frame) {
return nullptr;
}
auto last_frame_len = next_frame - last_frame;
Frame::Ptr frame;
if (_h265) {
frame = Factory::getFrameFromPtr(CodecH265, (char *)last_frame, last_frame_len, 0, 0);
} else {
frame = Factory::getFrameFromPtr(CodecH264, (char *)last_frame, last_frame_len, 0, 0);
}
if (frame->decodeAble()) {
_search_pos = 0;
return next_frame;
}
_search_pos += last_frame_len;
return nullptr;
}
////////////////////////////////////////////////////////////////////////////////////////////////////////
API_EXPORT mk_h264_splitter API_CALL mk_h264_splitter_create(on_mk_h264_splitter_frame cb, void *user_data, int is_h265) {


@@ -16,22 +16,30 @@ using namespace toolkit;
using namespace mediakit;
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create(const char *vhost, const char *app, const char *stream, int hls_enabled, int mp4_enabled) {
return mk_proxy_player_create3(vhost, app, stream, hls_enabled, mp4_enabled,-1);
}
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create2(const char *vhost, const char *app, const char *stream, mk_ini ini) {
return mk_proxy_player_create4(vhost, app, stream, ini, -1);
}
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create3(const char *vhost, const char *app, const char *stream, int hls_enabled, int mp4_enabled, int retry_count) {
assert(vhost && app && stream);
ProtocolOption option;
option.enable_hls = hls_enabled;
option.enable_mp4 = mp4_enabled;
PlayerProxy::Ptr *obj(new PlayerProxy::Ptr(new PlayerProxy(vhost, app, stream, option)));
PlayerProxy::Ptr *obj(new PlayerProxy::Ptr(new PlayerProxy(vhost, app, stream, option, retry_count)));
return (mk_proxy_player)obj;
}
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create2(const char *vhost, const char *app, const char *stream, mk_ini ini) {
API_EXPORT mk_proxy_player API_CALL mk_proxy_player_create4(const char *vhost, const char *app, const char *stream, mk_ini ini, int retry_count) {
assert(vhost && app && stream);
ProtocolOption option(*((mINI *)ini));
PlayerProxy::Ptr *obj(new PlayerProxy::Ptr(new PlayerProxy(vhost, app, stream, option)));
PlayerProxy::Ptr *obj(new PlayerProxy::Ptr(new PlayerProxy(vhost, app, stream, option, retry_count)));
return (mk_proxy_player)obj;
}
API_EXPORT void API_CALL mk_proxy_player_release(mk_proxy_player ctx) {
assert(ctx);
PlayerProxy::Ptr *obj = (PlayerProxy::Ptr *) ctx;
@@ -76,6 +84,20 @@ API_EXPORT void API_CALL mk_proxy_player_set_on_close2(mk_proxy_player ctx, on_m
});
}
API_EXPORT void API_CALL mk_proxy_player_set_on_play_result(mk_proxy_player ctx, on_mk_proxy_player_close cb, void *user_data, on_user_data_free user_data_free) {
assert(ctx);
PlayerProxy::Ptr &obj = *((PlayerProxy::Ptr *)ctx);
std::shared_ptr<void> ptr(user_data, user_data_free ? user_data_free : [](void *) {});
obj->getPoller()->async([obj, cb, ptr]() {
// switch to the poller thread before operating
obj->setPlayCallbackOnce([cb, ptr](const SockException &ex) {
if (cb) {
cb(ptr.get(), ex.getErrCode(), ex.what(), ex.getCustomCode());
}
});
});
}
API_EXPORT int API_CALL mk_proxy_player_total_reader_count(mk_proxy_player ctx){
assert(ctx);
PlayerProxy::Ptr &obj = *((PlayerProxy::Ptr *) ctx);

View File

@ -56,7 +56,7 @@ API_EXPORT void API_CALL mk_rtp_server_set_on_detach2(mk_rtp_server ctx, on_mk_r
RtpServer::Ptr *server = (RtpServer::Ptr *) ctx;
if (cb) {
std::shared_ptr<void> ptr(user_data, user_data_free ? user_data_free : [](void *) {});
(*server)->setOnDetach([cb, ptr]() {
(*server)->setOnDetach([cb, ptr](const SockException &ex) {
cb(ptr.get());
});
} else {

View File

@ -15,6 +15,9 @@
#include "Util/util.h"
#include "Util/mini.h"
#include "Util/logger.h"
#include "Util/TimeTicker.h"
#include "Poller/EventPoller.h"
#include "Thread/WorkThreadPool.h"
#include "Common/config.h"
using namespace std;
@ -132,6 +135,129 @@ API_EXPORT void API_CALL mk_ini_dump_file(mk_ini ini, const char *file) {
ptr->dumpFile(file);
}
extern uint64_t getTotalMemUsage();
extern uint64_t getTotalMemBlock();
extern uint64_t getThisThreadMemUsage();
extern uint64_t getThisThreadMemBlock();
extern std::vector<size_t> getBlockTypeSize();
extern uint64_t getTotalMemBlockByType(int type);
extern uint64_t getThisThreadMemBlockByType(int type);
namespace mediakit {
class MediaSource;
class MultiMediaSourceMuxer;
class FrameImp;
class Frame;
class RtpPacket;
class RtmpPacket;
} // namespace mediakit
namespace toolkit {
class TcpServer;
class TcpSession;
class UdpServer;
class UdpSession;
class TcpClient;
class Socket;
class Buffer;
class BufferRaw;
class BufferLikeString;
class BufferList;
} // namespace toolkit
API_EXPORT void API_CALL mk_get_statistic(on_mk_get_statistic_cb func, void *user_data, on_user_data_free free_cb) {
assert(func);
std::shared_ptr<void> data(user_data, free_cb);
auto cb = [func, data](const toolkit::mINI &ini) { func(data.get(), (mk_ini)&ini); };
auto obj = std::make_shared<toolkit::mINI>();
auto &val = *obj;
val["object.MediaSource"] = ObjectStatistic<MediaSource>::count();
val["object.MultiMediaSourceMuxer"] = ObjectStatistic<MultiMediaSourceMuxer>::count();
val["object.TcpServer"] = ObjectStatistic<TcpServer>::count();
val["object.TcpSession"] = ObjectStatistic<TcpSession>::count();
val["object.UdpServer"] = ObjectStatistic<UdpServer>::count();
val["object.UdpSession"] = ObjectStatistic<UdpSession>::count();
val["object.TcpClient"] = ObjectStatistic<TcpClient>::count();
val["object.Socket"] = ObjectStatistic<Socket>::count();
val["object.FrameImp"] = ObjectStatistic<FrameImp>::count();
val["object.Frame"] = ObjectStatistic<Frame>::count();
val["object.Buffer"] = ObjectStatistic<Buffer>::count();
val["object.BufferRaw"] = ObjectStatistic<BufferRaw>::count();
val["object.BufferLikeString"] = ObjectStatistic<BufferLikeString>::count();
val["object.BufferList"] = ObjectStatistic<BufferList>::count();
val["object.RtpPacket"] = ObjectStatistic<RtpPacket>::count();
val["object.RtmpPacket"] = ObjectStatistic<RtmpPacket>::count();
#ifdef ENABLE_MEM_DEBUG
auto bytes = getTotalMemUsage();
val["memory.memUsage"] = bytes;
val["memory.memUsageMB"] = (int)(bytes / 1024 / 1024);
val["memory.memBlock"] = getTotalMemBlock();
static auto block_type_size = getBlockTypeSize();
{
int i = 0;
string str;
size_t last = 0;
for (auto sz : block_type_size) {
str.append(to_string(last) + "~" + to_string(sz) + ":" + to_string(getTotalMemBlockByType(i++)) + ";");
last = sz;
}
str.pop_back();
val["memory.memBlockTypeCount"] = str;
}
#endif
auto thread_size = EventPollerPool::Instance().getExecutorSize() + WorkThreadPool::Instance().getExecutorSize();
std::shared_ptr<vector<toolkit::mINI>> thread_mem_info = std::make_shared<vector<toolkit::mINI>>(thread_size);
shared_ptr<void> finished(nullptr, [thread_mem_info, cb, obj](void *) {
for (auto &val : *thread_mem_info) {
auto thread_name = val["name"];
replace(thread_name, "...", "~~~");
auto prefix = "thread-" + thread_name + ".";
for (auto &pr : val) {
(*obj).emplace(prefix + pr.first, std::move(pr.second));
}
}
// 触发回调
cb(*obj);
});
auto pos = 0;
auto lambda = [&](const TaskExecutor::Ptr &executor) {
auto &val = (*thread_mem_info)[pos++];
val["load"] = executor->load();
Ticker ticker;
executor->async([finished, &val, ticker]() {
val["name"] = getThreadName();
val["delay"] = ticker.elapsedTime();
#ifdef ENABLE_MEM_DEBUG
auto bytes = getThisThreadMemUsage();
val["memUsage"] = bytes;
val["memUsageMB"] = bytes / 1024 / 1024;
val["memBlock"] = getThisThreadMemBlock();
{
int i = 0;
string str;
size_t last = 0;
for (auto sz : block_type_size) {
str.append(to_string(last) + "~" + to_string(sz) + ":" + to_string(getThisThreadMemBlockByType(i++)) + ";");
last = sz;
}
str.pop_back();
val["memBlockTypeCount"] = str;
}
#endif
});
};
EventPollerPool::Instance().for_each(lambda);
WorkThreadPool::Instance().for_each(lambda);
}
API_EXPORT void API_CALL mk_log_printf(int level, const char *file, const char *function, int line, const char *fmt, ...) {
va_list ap;
va_start(ap, fmt);

View File

@ -189,6 +189,14 @@ static void on_mk_webrtc_get_answer_sdp_func(void *user_data, const char *answer
free((void *)answer);
}
}
void API_CALL on_get_statistic_cb(void *user_data, mk_ini ini) {
const char *response_header[] = { NULL };
char *str = mk_ini_dump_string(ini);
mk_http_response_invoker_do_string(user_data, 200, response_header, str);
mk_free(str);
}
/**
* http api请求广播(GET/POST)
* @param parser http请求内容对象
@ -247,6 +255,9 @@ void API_CALL on_mk_http_request(const mk_parser parser,
mk_webrtc_get_answer_sdp(mk_http_response_invoker_clone(invoker), on_mk_webrtc_get_answer_sdp_func,
mk_parser_get_url_param(parser, "type"), mk_parser_get_content(parser, NULL), rtc_url);
} else if (strcmp(url, "/index/api/getStatistic") == 0) {
//拦截api: /index/api/getStatistic

mk_get_statistic(on_get_statistic_cb, mk_http_response_invoker_clone(invoker), (on_user_data_free) mk_http_response_invoker_clone_release);
} else {
*consumed = 0;
return;
@ -387,7 +398,7 @@ void API_CALL on_mk_rtsp_auth(const mk_media_info url_info,
/**
* mp4分片文件成功后广播
*/
void API_CALL on_mk_record_mp4(const mk_mp4_info mp4) {
void API_CALL on_mk_record_mp4(const mk_record_info mp4) {
log_printf(LOG_LEV,
"\nstart_time: %d\n"
"time_len: %d\n"
@ -399,16 +410,16 @@ void API_CALL on_mk_record_mp4(const mk_mp4_info mp4) {
"vhost: %s\n"
"app: %s\n"
"stream: %s\n",
mk_mp4_info_get_start_time(mp4),
mk_mp4_info_get_time_len(mp4),
mk_mp4_info_get_file_size(mp4),
mk_mp4_info_get_file_path(mp4),
mk_mp4_info_get_file_name(mp4),
mk_mp4_info_get_folder(mp4),
mk_mp4_info_get_url(mp4),
mk_mp4_info_get_vhost(mp4),
mk_mp4_info_get_app(mp4),
mk_mp4_info_get_stream(mp4));
mk_record_info_get_start_time(mp4),
mk_record_info_get_time_len(mp4),
mk_record_info_get_file_size(mp4),
mk_record_info_get_file_path(mp4),
mk_record_info_get_file_name(mp4),
mk_record_info_get_folder(mp4),
mk_record_info_get_url(mp4),
mk_record_info_get_vhost(mp4),
mk_record_info_get_app(mp4),
mk_record_info_get_stream(mp4));
}
/**

View File

@ -126,6 +126,8 @@ wait_track_ready_ms=10000
wait_add_track_ms=3000
#如果track未就绪,我们先缓存帧数据,但是有最大个数限制,防止内存溢出
unready_frame_cache=100
#是否启用观看人数变化事件广播,置1则启用,置0则关闭
broadcast_player_count_changed=0
[hls]
#hls写文件的buf大小,调整参数可以提高文件io性能
@ -367,6 +369,24 @@ start_bitrate=0
max_bitrate=0
min_bitrate=0
#nack接收端
#Nack缓存包最早时间间隔
maxNackMS=5000
#Nack包检查间隔(包数量)
rtpCacheCheckInterval=100
#nack发送端
#最大保留的rtp丢包状态个数
nackMaxSize=2048
#rtp丢包状态最长保留时间
nackMaxMS=3000
#nack最多请求重传次数
nackMaxCount=15
#nack重传频率,rtt的倍数
nackIntervalRatio=1.0
#nack包中rtp个数,减小此值可以让nack包响应更灵敏
nackRtpSize=8
[srt]
#srt播放推流、播放超时时间,单位秒
timeoutSec=5

View File

@ -43,7 +43,7 @@ WORKDIR /opt/media/ZLMediaKit
# 3rdpart init
WORKDIR /opt/media/ZLMediaKit/3rdpart
RUN wget https://mirror.ghproxy.com/https://github.com/cisco/libsrtp/archive/v2.3.0.tar.gz -O libsrtp-2.3.0.tar.gz && \
RUN wget https://github.com/cisco/libsrtp/archive/v2.3.0.tar.gz -O libsrtp-2.3.0.tar.gz && \
tar xfv libsrtp-2.3.0.tar.gz && \
mv libsrtp-2.3.0 libsrtp && \
cd libsrtp && ./configure --enable-openssl && make -j $(nproc) && make install

View File

@ -306,13 +306,12 @@ static int getBits(void *pvHandle, int iN)
uint8_t u8Nbyte;
uint8_t u8Shift;
uint32_t u32Result = 0;
int iRet = 0;
uint32_t iRet = 0;
int iResoLen = 0;
if(NULL == ptPtr)
{
RPT(RPT_ERR, "NULL pointer");
iRet = -1;
goto exit;
}
@ -324,7 +323,6 @@ static int getBits(void *pvHandle, int iN)
iResoLen = getBitsLeft(ptPtr);
if(iResoLen < iN)
{
iRet = -1;
goto exit;
}

View File

@ -22,11 +22,12 @@
#include <map>
#include <iostream>
#include "Common/JemallocUtil.h"
#include "Common/macros.h"
#include "System.h"
#include "Util/util.h"
#include "Util/logger.h"
#include "Util/uv_errno.h"
#include "Common/macros.h"
#include "Common/JemallocUtil.h"
using namespace std;
using namespace toolkit;
@ -66,6 +67,16 @@ static void save_jemalloc_stats() {
out.flush();
}
static std::string get_func_symbol(const std::string &symbol) {
size_t pos1 = symbol.find("(");
if (pos1 == string::npos) {
return "";
}
size_t pos2 = symbol.find("+", pos1);
auto ret = symbol.substr(pos1 + 1, pos2 - pos1 - 1);
return ret;
}
static void sig_crash(int sig) {
signal(sig, SIG_DFL);
void *array[MAX_STACK_FRAMES];
@ -78,6 +89,10 @@ static void sig_crash(int sig) {
std::string symbol(strings[i]);
ref.emplace_back(symbol);
#if defined(__linux) || defined(__linux__)
auto func_symbol = get_func_symbol(symbol);
if (!func_symbol.empty()) {
ref.emplace_back(toolkit::demangle(func_symbol.data()));
}
static auto addr2line = [](const string &address) {
string cmd = StrPrinter << "addr2line -C -f -e " << exePath() << " " << address;
return System::execute(cmd);

View File

@ -45,7 +45,7 @@
#include "Http/HttpRequester.h"
#include "Player/PlayerProxy.h"
#include "Pusher/PusherProxy.h"
#include "Rtp/RtpSelector.h"
#include "Rtp/RtpProcess.h"
#include "Record/MP4Reader.h"
#if defined(ENABLE_RTPPROXY)
@ -289,6 +289,8 @@ static inline void addHttpListener(){
it->second(parser, invoker, sender);
} catch (ApiRetException &ex) {
responseApi(ex.code(), ex.what(), invoker);
auto helper = static_cast<SocketHelper &>(sender).shared_from_this();
helper->getPoller()->async([helper, ex]() { helper->shutdown(SockException(Err_shutdown, ex.what())); }, false);
}
#ifdef ENABLE_MYSQL
catch(SqlException &ex){
@ -321,6 +323,11 @@ public:
return _map.erase(key);
}
size_t size() {
std::lock_guard<std::recursive_mutex> lck(_mtx);
return _map.size();
}
Pointer find(const std::string &key) const {
std::lock_guard<std::recursive_mutex> lck(_mtx);
auto it = _map.find(key);
@ -478,7 +485,7 @@ uint16_t openRtpServer(uint16_t local_port, const string &stream_id, int tcp_mod
auto server = s_rtp_server.makeWithAction(stream_id, [&](RtpServer::Ptr server) {
server->start(local_port, stream_id, (RtpServer::TcpMode)tcp_mode, local_ip.c_str(), re_use_port, ssrc, only_track, multiplex);
});
server->setOnDetach([stream_id]() {
server->setOnDetach([stream_id](const SockException &ex) {
//设置rtp超时移除事件
s_rtp_server.erase(stream_id);
});
@ -1191,8 +1198,8 @@ void installWebApi() {
api_regist("/index/api/getRtpInfo",[](API_ARGS_MAP){
CHECK_SECRET();
CHECK_ARGS("stream_id");
auto process = RtpSelector::Instance().getProcess(allArgs["stream_id"], false);
auto src = MediaSource::find(DEFAULT_VHOST, kRtpAppName, allArgs["stream_id"]);
auto process = src ? src->getRtpProcess() : nullptr;
if (!process) {
val["exist"] = false;
return;
@ -1431,9 +1438,10 @@ void installWebApi() {
CHECK_SECRET();
CHECK_ARGS("stream_id");
//只是暂停流的检查,流媒体服务器做为流负载服务,收流就转发,RTSP/RTMP有自己暂停协议
auto rtp_process = RtpSelector::Instance().getProcess(allArgs["stream_id"], false);
if (rtp_process) {
rtp_process->setStopCheckRtp(true);
auto src = MediaSource::find(DEFAULT_VHOST, kRtpAppName, allArgs["stream_id"]);
auto process = src ? src->getRtpProcess() : nullptr;
if (process) {
process->setStopCheckRtp(true);
} else {
val["code"] = API::NotFound;
}
@ -1442,9 +1450,10 @@ void installWebApi() {
api_regist("/index/api/resumeRtpCheck", [](API_ARGS_MAP) {
CHECK_SECRET();
CHECK_ARGS("stream_id");
auto rtp_process = RtpSelector::Instance().getProcess(allArgs["stream_id"], false);
if (rtp_process) {
rtp_process->setStopCheckRtp(false);
auto src = MediaSource::find(DEFAULT_VHOST, kRtpAppName, allArgs["stream_id"]);
auto process = src ? src->getRtpProcess() : nullptr;
if (process) {
process->setStopCheckRtp(false);
} else {
val["code"] = API::NotFound;
}
@ -1588,12 +1597,14 @@ void installWebApi() {
auto record_path = Recorder::getRecordPath(Recorder::type_mp4, tuple, allArgs["customized_path"]);
auto period = allArgs["period"];
record_path = record_path + period + "/";
bool recording = false;
auto name = allArgs["name"];
if (!name.empty()) {
// 删除指定文件
record_path += name;
}
bool recording = false;
{
} else {
// 删除文件夹,先判断该流是否正在录制中
auto src = MediaSource::find(allArgs["vhost"], allArgs["app"], allArgs["stream"]);
if (src && src->isRecording(Recorder::type_mp4)) {
recording = true;
@ -1762,7 +1773,7 @@ void installWebApi() {
, _session_id(std::move(session_id)) {}
~WebRtcArgsImp() override = default;
variant operator[](const string &key) const override {
toolkit::variant operator[](const string &key) const override {
if (key == "url") {
return getUrl();
}

View File

@ -321,6 +321,8 @@ int H264Encoder::inputData(char *yuv[3], int linesize[3], int64_t cts, H264Frame
_aFrames[i].iType = pNal.i_type;
_aFrames[i].iLength = pNal.i_payload;
_aFrames[i].pucData = pNal.p_payload;
_aFrames[i].dts = _pPicOut->i_dts;
_aFrames[i].pts = _pPicOut->i_pts;
}
*out_frame = _aFrames;
return iNal;

View File

@ -27,6 +27,9 @@ public:
int iType;
int iLength;
uint8_t *pucData;
int64_t dts;
int64_t pts;
} H264Frame;
H264Encoder();

View File

@ -39,7 +39,7 @@ bool DevChannel::inputYUV(char *yuv[3], int linesize[3], uint64_t cts) {
int frames = _pH264Enc->inputData(yuv, linesize, cts, &out_frames);
bool ret = false;
for (int i = 0; i < frames; i++) {
ret = inputH264((char *) out_frames[i].pucData, out_frames[i].iLength, cts) ? true : ret;
ret = inputH264((char *) out_frames[i].pucData, out_frames[i].iLength, out_frames[i].dts, out_frames[i].pts) ? true : ret;
}
return ret;
}

View File

@ -271,9 +271,14 @@ toolkit::EventPoller::Ptr MediaSource::getOwnerPoller() {
throw std::runtime_error(toolkit::demangle(typeid(*this).name()) + "::getOwnerPoller failed: " + getUrl());
}
std::shared_ptr<MultiMediaSourceMuxer> MediaSource::getMuxer() {
std::shared_ptr<MultiMediaSourceMuxer> MediaSource::getMuxer() const {
auto listener = _listener.lock();
return listener ? listener->getMuxer(*this) : nullptr;
return listener ? listener->getMuxer(const_cast<MediaSource&>(*this)) : nullptr;
}
std::shared_ptr<RtpProcess> MediaSource::getRtpProcess() const {
auto listener = _listener.lock();
return listener ? listener->getRtpProcess(const_cast<MediaSource&>(*this)) : nullptr;
}
void MediaSource::onReaderChanged(int size) {
@ -652,6 +657,10 @@ MediaSource::Ptr MediaSource::createFromMP4(const string &schema, const string &
/////////////////////////////////////MediaSourceEvent//////////////////////////////////////
void MediaSourceEvent::onReaderChanged(MediaSource &sender, int size){
GET_CONFIG(bool, enable, General::kBroadcastPlayerCountChanged);
if (enable) {
NOTICE_EMIT(BroadcastPlayerCountChangedArgs, Broadcast::kBroadcastPlayerCountChanged, sender.getMediaTuple(), sender.totalReaderCount());
}
if (size || sender.totalReaderCount()) {
//还有人观看该视频,不触发关闭事件
_async_close_timer = nullptr;
@ -799,11 +808,16 @@ toolkit::EventPoller::Ptr MediaSourceEventInterceptor::getOwnerPoller(MediaSourc
throw std::runtime_error(toolkit::demangle(typeid(*this).name()) + "::getOwnerPoller failed");
}
std::shared_ptr<MultiMediaSourceMuxer> MediaSourceEventInterceptor::getMuxer(MediaSource &sender) {
std::shared_ptr<MultiMediaSourceMuxer> MediaSourceEventInterceptor::getMuxer(MediaSource &sender) const {
auto listener = _listener.lock();
return listener ? listener->getMuxer(sender) : nullptr;
}
std::shared_ptr<RtpProcess> MediaSourceEventInterceptor::getRtpProcess(MediaSource &sender) const {
auto listener = _listener.lock();
return listener ? listener->getRtpProcess(sender) : nullptr;
}
bool MediaSourceEventInterceptor::setupRecord(MediaSource &sender, Recorder::type type, bool start, const string &custom_path, size_t max_second) {
auto listener = _listener.lock();
if (!listener) {

View File

@ -41,6 +41,7 @@ enum class MediaOriginType : uint8_t {
std::string getOriginTypeString(MediaOriginType type);
class MediaSource;
class RtpProcess;
class MultiMediaSourceMuxer;
class MediaSourceEvent {
public:
@ -88,7 +89,9 @@ public:
// 获取所有track相关信息
virtual std::vector<Track::Ptr> getMediaTracks(MediaSource &sender, bool trackReady = true) const { return std::vector<Track::Ptr>(); };
// 获取MultiMediaSourceMuxer对象
virtual std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) { return nullptr; }
virtual std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) const { return nullptr; }
// 获取RtpProcess对象
virtual std::shared_ptr<RtpProcess> getRtpProcess(MediaSource &sender) const { return nullptr; }
class SendRtpArgs {
public:
@ -278,7 +281,8 @@ public:
bool stopSendRtp(MediaSource &sender, const std::string &ssrc) override;
float getLossRate(MediaSource &sender, TrackType type) override;
toolkit::EventPoller::Ptr getOwnerPoller(MediaSource &sender) override;
std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) override;
std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) const override;
std::shared_ptr<RtpProcess> getRtpProcess(MediaSource &sender) const override;
private:
std::weak_ptr<MediaSourceEvent> _listener;
@ -395,7 +399,9 @@ public:
// 获取所在线程
toolkit::EventPoller::Ptr getOwnerPoller();
// 获取MultiMediaSourceMuxer对象
std::shared_ptr<MultiMediaSourceMuxer> getMuxer();
std::shared_ptr<MultiMediaSourceMuxer> getMuxer() const;
// 获取RtpProcess对象
std::shared_ptr<RtpProcess> getRtpProcess() const;
////////////////static方法查找或生成MediaSource////////////////

View File

@ -470,8 +470,8 @@ EventPoller::Ptr MultiMediaSourceMuxer::getOwnerPoller(MediaSource &sender) {
}
}
std::shared_ptr<MultiMediaSourceMuxer> MultiMediaSourceMuxer::getMuxer(MediaSource &sender) {
return shared_from_this();
std::shared_ptr<MultiMediaSourceMuxer> MultiMediaSourceMuxer::getMuxer(MediaSource &sender) const {
return const_cast<MultiMediaSourceMuxer*>(this)->shared_from_this();
}
bool MultiMediaSourceMuxer::onTrackReady(const Track::Ptr &track) {

View File

@ -132,7 +132,7 @@ public:
/**
* 获取MultiMediaSourceMuxer对象
*/
std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) override;
std::shared_ptr<MultiMediaSourceMuxer> getMuxer(MediaSource &sender) const override;
const ProtocolOption &getOption() const;
const MediaTuple &getMediaTuple() const;

View File

@ -64,6 +64,7 @@ const string kBroadcastRtcSctpFailed = "kBroadcastRtcSctpFailed";
const string kBroadcastRtcSctpClosed = "kBroadcastRtcSctpClosed";
const string kBroadcastRtcSctpSend = "kBroadcastRtcSctpSend";
const string kBroadcastRtcSctpReceived = "kBroadcastRtcSctpReceived";
const string kBroadcastPlayerCountChanged = "kBroadcastPlayerCountChanged";
} // namespace Broadcast
@ -82,6 +83,7 @@ const string kEnableFFmpegLog = GENERAL_FIELD "enable_ffmpeg_log";
const string kWaitTrackReadyMS = GENERAL_FIELD "wait_track_ready_ms";
const string kWaitAddTrackMS = GENERAL_FIELD "wait_add_track_ms";
const string kUnreadyFrameCache = GENERAL_FIELD "unready_frame_cache";
const string kBroadcastPlayerCountChanged = GENERAL_FIELD "broadcast_player_count_changed";
static onceToken token([]() {
mINI::Instance()[kFlowThreshold] = 1024;
@ -96,6 +98,7 @@ static onceToken token([]() {
mINI::Instance()[kWaitTrackReadyMS] = 10000;
mINI::Instance()[kWaitAddTrackMS] = 3000;
mINI::Instance()[kUnreadyFrameCache] = 100;
mINI::Instance()[kBroadcastPlayerCountChanged] = 0;
});
} // namespace General
@ -361,6 +364,7 @@ static onceToken token([]() {
namespace Client {
const string kNetAdapter = "net_adapter";
const string kRtpType = "rtp_type";
const string kRtspBeatType = "rtsp_beat_type";
const string kRtspUser = "rtsp_user";
const string kRtspPwd = "rtsp_pwd";
const string kRtspPwdIsMD5 = "rtsp_pwd_md5";

View File

@ -124,6 +124,10 @@ extern const std::string kBroadcastRtcSctpSend;
extern const std::string kBroadcastRtcSctpReceived;
#define BroadcastRtcSctpReceivedArgs WebRtcTransport& sender, uint16_t &streamId, uint32_t &ppid, const uint8_t *&msg, size_t &len
// 观看人数变化广播
extern const std::string kBroadcastPlayerCountChanged;
#define BroadcastPlayerCountChangedArgs const MediaTuple& args, const int& count
#define ReloadConfigTag ((void *)(0xFF))
#define RELOAD_KEY(arg, key) \
do { \
@ -196,6 +200,8 @@ extern const std::string kWaitTrackReadyMS;
extern const std::string kWaitAddTrackMS;
// 如果track未就绪我们先缓存帧数据但是有最大个数限制(100帧时大约4秒),防止内存溢出
extern const std::string kUnreadyFrameCache;
// 是否启用观看人数变化事件广播,置1则启用,置0则关闭
extern const std::string kBroadcastPlayerCountChanged;
} // namespace General
namespace Protocol {
@ -417,6 +423,9 @@ extern const std::string kNetAdapter;
// 设置rtp传输类型,可选项有0(tcp默认)、1(udp)、2(组播)
// 设置方法:player[PlayerBase::kRtpType] = 0/1/2;
extern const std::string kRtpType;
// rtsp播放器发送信令心跳还是rtcp心跳,可选项有0(同时发)、1(rtcp心跳)、2(信令心跳)
// 设置方法:player[PlayerBase::kRtspBeatType] = 0/1/2;
extern const std::string kRtspBeatType;
// rtsp认证用户名
extern const std::string kRtspUser;
// rtsp认证用户密码,可以是明文也可以是md5,md5密码生成方式 md5(username:realm:password)

View File

@ -32,7 +32,7 @@ class RecordInfo: public MediaTuple {
public:
time_t start_time; // GMT 标准时间,单位秒
float time_len; // 录像长度,单位秒
off_t file_size; // 文件大小,单位 BYTE
uint64_t file_size; // 文件大小,单位 BYTE
std::string file_path; // 文件路径
std::string file_name; // 文件名称
std::string folder; // 文件夹路径

View File

@ -11,26 +11,29 @@
#if defined(ENABLE_RTPPROXY)
#include "GB28181Process.h"
#include "RtpProcess.h"
#include "RtpSelector.h"
#include "Http/HttpTSPlayer.h"
#include "Util/File.h"
#include "Common/config.h"
using namespace std;
using namespace toolkit;
static constexpr char kRtpAppName[] = "rtp";
//在创建_muxer对象前(也就是推流鉴权成功前),需要先缓存frame,这样可以防止丢包,提高体验
//但是同时需要控制缓冲长度,防止内存溢出。200帧数据大概有10秒数据,应该足矣等待鉴权hook返回
static constexpr size_t kMaxCachedFrame = 200;
namespace mediakit {
RtpProcess::RtpProcess(const string &stream_id) {
RtpProcess::Ptr RtpProcess::createProcess(std::string stream_id) {
RtpProcess::Ptr ret(new RtpProcess(std::move(stream_id)));
ret->createTimer();
return ret;
}
RtpProcess::RtpProcess(string stream_id) {
_media_info.schema = kRtpAppName;
_media_info.vhost = DEFAULT_VHOST;
_media_info.app = kRtpAppName;
_media_info.stream = stream_id;
_media_info.stream = std::move(stream_id);
GET_CONFIG(string, dump_dir, RtpProxy::kDumpDir);
{
@ -75,6 +78,25 @@ RtpProcess::~RtpProcess() {
}
}
void RtpProcess::onManager() {
if (!alive()) {
onDetach(SockException(Err_timeout, "RtpProcess timeout"));
}
}
void RtpProcess::createTimer() {
//创建超时管理定时器
weak_ptr<RtpProcess> weakSelf = shared_from_this();
_timer = std::make_shared<Timer>(3.0f, [weakSelf] {
auto strongSelf = weakSelf.lock();
if (!strongSelf) {
return false;
}
strongSelf->onManager();
return true;
}, EventPollerPool::Instance().getPoller());
}
bool RtpProcess::inputRtp(bool is_udp, const Socket::Ptr &sock, const char *data, size_t len, const struct sockaddr *addr, uint64_t *dts_out) {
if (!isRtp(data, len)) {
WarnP(this) << "Not rtp packet";
@ -203,13 +225,14 @@ void RtpProcess::setOnlyTrack(OnlyTrack only_track) {
_only_track = only_track;
}
void RtpProcess::onDetach() {
void RtpProcess::onDetach(const SockException &ex) {
if (_on_detach) {
_on_detach();
WarnL << ex << ", stream_id: " << getIdentifier();
_on_detach(ex);
}
}
void RtpProcess::setOnDetach(function<void()> cb) {
void RtpProcess::setOnDetach(onDetachCB cb) {
_on_detach = std::move(cb);
}
@ -256,9 +279,6 @@ void RtpProcess::emitOnPublish() {
}
if (err.empty()) {
strong_self->_muxer = std::make_shared<MultiMediaSourceMuxer>(strong_self->_media_info, 0.0f, option);
if (!option.stream_replace.empty()) {
RtpSelector::Instance().addStreamReplace(strong_self->_media_info.stream, option.stream_replace);
}
switch (strong_self->_only_track) {
case kOnlyAudio: strong_self->_muxer->setOnlyAudio(); break;
case kOnlyVideo: strong_self->_muxer->enableAudio(false); break;
@ -294,6 +314,15 @@ std::shared_ptr<SockInfo> RtpProcess::getOriginSock(MediaSource &sender) const {
return const_cast<RtpProcess *>(this)->shared_from_this();
}
RtpProcess::Ptr RtpProcess::getRtpProcess(mediakit::MediaSource &sender) const {
return const_cast<RtpProcess *>(this)->shared_from_this();
}
bool RtpProcess::close(mediakit::MediaSource &sender) {
onDetach(SockException(Err_shutdown, "close media"));
return true;
}
toolkit::EventPoller::Ptr RtpProcess::getOwnerPoller(MediaSource &sender) {
if (_sock) {
return _sock->getPoller();

View File

@ -18,11 +18,14 @@
namespace mediakit {
class RtpProcess final : public RtcpContextForRecv, public toolkit::SockInfo, public MediaSinkInterface, public MediaSourceEventInterceptor, public std::enable_shared_from_this<RtpProcess>{
static constexpr char kRtpAppName[] = "rtp";
class RtpProcess final : public RtcpContextForRecv, public toolkit::SockInfo, public MediaSinkInterface, public MediaSourceEvent, public std::enable_shared_from_this<RtpProcess>{
public:
using Ptr = std::shared_ptr<RtpProcess>;
friend class RtpProcessHelper;
RtpProcess(const std::string &stream_id);
using onDetachCB = std::function<void(const toolkit::SockException &ex)>;
static Ptr createProcess(std::string stream_id);
~RtpProcess();
enum OnlyTrack { kAll = 0, kOnlyAudio = 1, kOnlyVideo = 2 };
@ -38,20 +41,16 @@ public:
*/
bool inputRtp(bool is_udp, const toolkit::Socket::Ptr &sock, const char *data, size_t len, const struct sockaddr *addr , uint64_t *dts_out = nullptr);
/**
*
*/
bool alive();
/**
* RtpSelector移除时触发
*/
void onDetach();
void onDetach(const toolkit::SockException &ex);
/**
* 设置onDetach事件回调
*/
void setOnDetach(std::function<void()> cb);
void setOnDetach(onDetachCB cb);
/**
* 设置是否停止RTP超时检查,false:检查RTP超时,true:停止检查
@ -88,10 +87,17 @@ protected:
std::shared_ptr<SockInfo> getOriginSock(MediaSource &sender) const override;
toolkit::EventPoller::Ptr getOwnerPoller(MediaSource &sender) override;
float getLossRate(MediaSource &sender, TrackType type) override;
Ptr getRtpProcess(mediakit::MediaSource &sender) const override;
bool close(mediakit::MediaSource &sender) override;
private:
RtpProcess(std::string stream_id);
void emitOnPublish();
void doCachedFunc();
bool alive();
void onManager();
void createTimer();
private:
OnlyTrack _only_track = kAll;
@ -102,12 +108,13 @@ private:
toolkit::Socket::Ptr _sock;
MediaInfo _media_info;
toolkit::Ticker _last_frame_time;
std::function<void()> _on_detach;
onDetachCB _on_detach;
std::shared_ptr<FILE> _save_file_rtp;
std::shared_ptr<FILE> _save_file_video;
ProcessInterface::Ptr _process;
MultiMediaSourceMuxer::Ptr _muxer;
std::atomic_bool _stop_rtp_check{false};
toolkit::Timer::Ptr _timer;
toolkit::Ticker _last_check_alive;
std::recursive_mutex _func_mtx;
std::deque<std::function<void()> > _cached_func;

View File

@ -1,168 +0,0 @@
/*
* Copyright (c) 2016-present The ZLMediaKit project authors. All Rights Reserved.
*
* This file is part of ZLMediaKit(https://github.com/ZLMediaKit/ZLMediaKit).
*
* Use of this source code is governed by MIT-like license that can be found in the
* LICENSE file in the root of the source tree. All contributing project authors
* may be found in the AUTHORS file in the root of the source tree.
*/
#if defined(ENABLE_RTPPROXY)
#include <stddef.h>
#include "RtpSelector.h"
#include "RtpSplitter.h"
using namespace std;
using namespace toolkit;
namespace mediakit{
INSTANCE_IMP(RtpSelector);
void RtpSelector::clear(){
lock_guard<decltype(_mtx_map)> lck(_mtx_map);
_map_rtp_process.clear();
_map_stream_replace.clear();
}
bool RtpSelector::getSSRC(const char *data, size_t data_len, uint32_t &ssrc){
if (data_len < 12) {
return false;
}
uint32_t *ssrc_ptr = (uint32_t *) (data + 8);
ssrc = ntohl(*ssrc_ptr);
return true;
}
RtpProcess::Ptr RtpSelector::getProcess(const string &stream_id,bool makeNew) {
lock_guard<decltype(_mtx_map)> lck(_mtx_map);
string stream_id_origin = stream_id;
auto it_replace = _map_stream_replace.find(stream_id);
if (it_replace != _map_stream_replace.end()) {
stream_id_origin = it_replace->second;
}
auto it = _map_rtp_process.find(stream_id_origin);
if (it == _map_rtp_process.end() && !makeNew) {
return nullptr;
}
if (it != _map_rtp_process.end() && makeNew) {
//已经被其他线程持有了,不得再被持有,否则会存在线程安全的问题
throw ProcessExisted(StrPrinter << "RtpProcess(" << stream_id_origin << ") already existed");
}
RtpProcessHelper::Ptr &ref = _map_rtp_process[stream_id_origin];
if (!ref) {
ref = std::make_shared<RtpProcessHelper>(stream_id_origin, shared_from_this());
ref->attachEvent();
createTimer();
}
return ref->getProcess();
}
void RtpSelector::createTimer() {
if (!_timer) {
//创建超时管理定时器
weak_ptr<RtpSelector> weakSelf = shared_from_this();
_timer = std::make_shared<Timer>(3.0f, [weakSelf] {
auto strongSelf = weakSelf.lock();
if (!strongSelf) {
return false;
}
strongSelf->onManager();
return true;
}, EventPollerPool::Instance().getPoller());
}
}
void RtpSelector::delProcess(const string &stream_id,const RtpProcess *ptr) {
RtpProcess::Ptr process;
{
lock_guard<decltype(_mtx_map)> lck(_mtx_map);
auto it = _map_rtp_process.find(stream_id);
if (it == _map_rtp_process.end()) {
return;
}
if (it->second->getProcess().get() != ptr) {
return;
}
process = it->second->getProcess();
_map_rtp_process.erase(it);
delStreamReplace(stream_id);
}
process->onDetach();
}
void RtpSelector::addStreamReplace(const string &stream_id, const std::string &stream_replace) {
lock_guard<decltype(_mtx_map)> lck(_mtx_map);
_map_stream_replace[stream_replace] = stream_id;
}
void RtpSelector::delStreamReplace(const string &stream_id) {
for (auto it = _map_stream_replace.begin(); it != _map_stream_replace.end(); ++it) {
if (it->second == stream_id) {
_map_stream_replace.erase(it);
break;
}
}
}
void RtpSelector::onManager() {
List<RtpProcess::Ptr> clear_list;
{
lock_guard<decltype(_mtx_map)> lck(_mtx_map);
for (auto it = _map_rtp_process.begin(); it != _map_rtp_process.end();) {
if (it->second->getProcess()->alive()) {
++it;
continue;
}
WarnL << "RtpProcess timeout:" << it->first;
clear_list.emplace_back(it->second->getProcess());
delStreamReplace(it->first);
it = _map_rtp_process.erase(it);
}
}
clear_list.for_each([](const RtpProcess::Ptr &process) {
process->onDetach();
});
}
RtpProcessHelper::RtpProcessHelper(const string &stream_id, const weak_ptr<RtpSelector> &parent) {
_stream_id = stream_id;
_parent = parent;
_process = std::make_shared<RtpProcess>(stream_id);
}
RtpProcessHelper::~RtpProcessHelper() {
auto process = std::move(_process);
try {
// flush时确保线程安全
process->getOwnerPoller(MediaSource::NullMediaSource())->async([process]() { process->flush(); });
} catch (...) {
// 忽略getOwnerPoller可能抛出的异常
}
}
void RtpProcessHelper::attachEvent() {
//主要目的是close回调触发时能把对象从RtpSelector中删除
_process->setDelegate(shared_from_this());
}
bool RtpProcessHelper::close(MediaSource &sender) {
//此回调在其他线程触发
auto parent = _parent.lock();
if (!parent) {
return false;
}
parent->delProcess(_stream_id, _process.get());
WarnL << "close media: " << sender.getUrl();
return true;
}
RtpProcess::Ptr &RtpProcessHelper::getProcess() {
return _process;
}
}//namespace mediakit
#endif//defined(ENABLE_RTPPROXY)

View File

@ -1,89 +0,0 @@
/*
* Copyright (c) 2016-present The ZLMediaKit project authors. All Rights Reserved.
*
* This file is part of ZLMediaKit(https://github.com/ZLMediaKit/ZLMediaKit).
*
* Use of this source code is governed by MIT-like license that can be found in the
* LICENSE file in the root of the source tree. All contributing project authors
* may be found in the AUTHORS file in the root of the source tree.
*/
#ifndef ZLMEDIAKIT_RTPSELECTOR_H
#define ZLMEDIAKIT_RTPSELECTOR_H
#if defined(ENABLE_RTPPROXY)
#include <stdint.h>
#include <mutex>
#include <unordered_map>
#include "RtpProcess.h"
#include "Common/MediaSource.h"
namespace mediakit{
class RtpSelector;
class RtpProcessHelper : public MediaSourceEvent , public std::enable_shared_from_this<RtpProcessHelper> {
public:
using Ptr = std::shared_ptr<RtpProcessHelper>;
RtpProcessHelper(const std::string &stream_id, const std::weak_ptr<RtpSelector > &parent);
~RtpProcessHelper();
void attachEvent();
RtpProcess::Ptr & getProcess();
protected:
// notify the pusher to stop streaming
bool close(MediaSource &sender) override;
private:
std::string _stream_id;
RtpProcess::Ptr _process;
std::weak_ptr<RtpSelector> _parent;
};
class RtpSelector : public std::enable_shared_from_this<RtpSelector>{
public:
class ProcessExisted : public std::runtime_error {
public:
template<typename ...T>
ProcessExisted(T && ...args) : std::runtime_error(std::forward<T>(args)...) {}
};
static bool getSSRC(const char *data,size_t data_len, uint32_t &ssrc);
static RtpSelector &Instance();
/**
 * Clear all rtp processors
 */
void clear();
/**
 * Get an rtp processor
 * @param stream_id stream id
 * @param makeNew whether to create one when it does not exist; true to create
 * @return the rtp processor
 */
RtpProcess::Ptr getProcess(const std::string &stream_id, bool makeNew);
/**
 * Delete an rtp processor
 * @param stream_id stream id
 * @param ptr pointer to the rtp processor
 */
void delProcess(const std::string &stream_id, const RtpProcess *ptr);
void addStreamReplace(const std::string &stream_id, const std::string &stream_replace);
private:
void onManager();
void createTimer();
void delStreamReplace(const std::string &stream_id);
private:
toolkit::Timer::Ptr _timer;
std::recursive_mutex _mtx_map;
std::unordered_map<std::string,RtpProcessHelper::Ptr> _map_rtp_process;
std::unordered_map<std::string,std::string> _map_stream_replace;
};
}//namespace mediakit
#endif//defined(ENABLE_RTPPROXY)
#endif //ZLMEDIAKIT_RTPSELECTOR_H

View File

@ -11,7 +11,7 @@
#if defined(ENABLE_RTPPROXY)
#include "Util/uv_errno.h"
#include "RtpServer.h"
#include "RtpSelector.h"
#include "RtpProcess.h"
#include "Rtcp/RtcpContext.h"
#include "Common/config.h"
@ -35,38 +35,34 @@ public:
_stream_id = std::move(stream_id);
}
~RtcpHelper() {
if (_process) {
// remove the rtp processor
RtpSelector::Instance().delProcess(_stream_id, _process.get());
}
}
void setRtpServerInfo(uint16_t local_port, RtpServer::TcpMode mode, bool re_use_port, uint32_t ssrc, int only_track) {
_local_port = local_port;
_tcp_mode = mode;
_re_use_port = re_use_port;
_ssrc = ssrc;
_only_track = only_track;
_process = RtpProcess::createProcess(_stream_id);
_process->setOnlyTrack((RtpProcess::OnlyTrack)only_track);
_timeout_cb = [=]() mutable {
NOTICE_EMIT(BroadcastRtpServerTimeoutArgs, Broadcast::kBroadcastRtpServerTimeout, local_port, _stream_id, (int)mode, re_use_port, ssrc);
};
weak_ptr<RtcpHelper> weak_self = shared_from_this();
_process->setOnDetach([weak_self](const SockException &ex) {
if (auto strong_self = weak_self.lock()) {
if (strong_self->_on_detach) {
strong_self->_on_detach(ex);
}
if (ex.getErrCode() == Err_timeout) {
strong_self->_timeout_cb();
}
}
});
}
void setOnDetach(function<void()> cb) {
if (_process) {
_process->setOnDetach(std::move(cb));
} else {
_on_detach = std::move(cb);
}
}
void setOnDetach(RtpProcess::onDetachCB cb) { _on_detach = std::move(cb); }
RtpProcess::Ptr getProcess() const { return _process; }
void onRecvRtp(const Socket::Ptr &sock, const Buffer::Ptr &buf, struct sockaddr *addr) {
if (!_process) {
_process = RtpSelector::Instance().getProcess(_stream_id, true);
_process->setOnlyTrack((RtpProcess::OnlyTrack)_only_track);
_process->setOnDetach(std::move(_on_detach));
cancelDelayTask();
}
_process->inputRtp(true, sock, buf->data(), buf->size(), addr);
// collect rtp reception statistics, used for sending rr packets
auto header = (RtpHeader *)buf->data();
sendRtcp(ntohl(header->ssrc), addr);
@ -92,37 +88,12 @@ public:
// after receiving an sr rtcp, reply with an rr rtcp
strong_self->sendRtcp(strong_self->_ssrc, (struct sockaddr *)(strong_self->_rtcp_addr.get()));
});
GET_CONFIG(uint64_t, timeoutSec, RtpProxy::kTimeoutSec);
_delay_task = _rtcp_sock->getPoller()->doDelayTask(timeoutSec * 1000, [weak_self]() {
if (auto strong_self = weak_self.lock()) {
auto process = RtpSelector::Instance().getProcess(strong_self->_stream_id, false);
if (!process && strong_self->_on_detach) {
strong_self->_on_detach();
}
if (process && strong_self->_on_detach) { // tcp link: hand over the detach callback so the rtpServer is cleaned up when the connection drops
process->setOnDetach(std::move(strong_self->_on_detach));
}
if (!process) { // process was never created, fire the rtp server timeout event
NOTICE_EMIT(BroadcastRtpServerTimeoutArgs, Broadcast::kBroadcastRtpServerTimeout, strong_self->_local_port, strong_self->_stream_id,
(int)strong_self->_tcp_mode, strong_self->_re_use_port, strong_self->_ssrc);
}
}
return 0;
});
}
void cancelDelayTask() {
if (_delay_task) {
_delay_task->cancel();
_delay_task = nullptr;
}
}
private:
void sendRtcp(uint32_t rtp_ssrc, struct sockaddr *addr) {
// send rtcp every 5 seconds
if (_ticker.elapsedTime() < 5000 || !_process) {
if (_ticker.elapsedTime() < 5000) {
return;
}
_ticker.resetTime();
@ -141,19 +112,14 @@ private:
}
private:
bool _re_use_port = false;
int _only_track = 0;
uint16_t _local_port = 0;
uint32_t _ssrc = 0;
RtpServer::TcpMode _tcp_mode = RtpServer::NONE;
std::function<void()> _timeout_cb;
Ticker _ticker;
Socket::Ptr _rtcp_sock;
RtpProcess::Ptr _process;
std::string _stream_id;
function<void()> _on_detach;
RtpProcess::onDetachCB _on_detach;
std::shared_ptr<struct sockaddr_storage> _rtcp_addr;
EventPoller::DelayTask::Ptr _delay_task;
};
void RtpServer::start(uint16_t local_port, const string &stream_id, TcpMode tcp_mode, const char *local_ip, bool re_use_port, uint32_t ssrc, int only_track, bool multiplex) {
@ -177,22 +143,6 @@ void RtpServer::start(uint16_t local_port, const string &stream_id, TcpMode tcp_
GET_CONFIG(int, udpRecvSocketBuffer, RtpProxy::kUdpRecvSocketBuffer);
SockUtil::setRecvBuf(rtp_socket->rawFD(), udpRecvSocketBuffer);
TcpServer::Ptr tcp_server;
_tcp_mode = tcp_mode;
if (tcp_mode == PASSIVE || tcp_mode == ACTIVE) {
// create the tcp server
tcp_server = std::make_shared<TcpServer>(rtp_socket->getPoller());
(*tcp_server)[RtpSession::kStreamID] = stream_id;
(*tcp_server)[RtpSession::kSSRC] = ssrc;
(*tcp_server)[RtpSession::kOnlyTrack] = only_track;
if (tcp_mode == PASSIVE) {
tcp_server->start<RtpSession>(local_port, local_ip);
} else if (stream_id.empty()) {
// in tcp active mode one port carries exactly one stream, so a stream id is required; the TcpServer object is created only to pass parameters
throw std::runtime_error(StrPrinter << "tcp主动模式时必需指定流id");
}
}
// create the udp server
UdpServer::Ptr udp_server;
RtcpHelper::Ptr helper;
@ -222,13 +172,32 @@ void RtpServer::start(uint16_t local_port, const string &stream_id, TcpMode tcp_
});
} else {
// one port with multiple threads receiving multiple streams, distinguished by ssrc
udp_server = std::make_shared<UdpServer>(rtp_socket->getPoller());
udp_server = std::make_shared<UdpServer>();
(*udp_server)[RtpSession::kOnlyTrack] = only_track;
(*udp_server)[RtpSession::kUdpRecvBuffer] = udpRecvSocketBuffer;
udp_server->start<RtpSession>(local_port, local_ip);
rtp_socket = nullptr;
}
TcpServer::Ptr tcp_server;
if (tcp_mode == PASSIVE || tcp_mode == ACTIVE) {
// create the tcp server
tcp_server = std::make_shared<TcpServer>();
(*tcp_server)[RtpSession::kStreamID] = stream_id;
(*tcp_server)[RtpSession::kSSRC] = ssrc;
(*tcp_server)[RtpSession::kOnlyTrack] = only_track;
if (tcp_mode == PASSIVE) {
weak_ptr<RtpServer> weak_self = shared_from_this();
auto processor = helper ? helper->getProcess() : nullptr;
tcp_server->start<RtpSession>(local_port, local_ip, 1024, [weak_self, processor](std::shared_ptr<RtpSession> &session) {
session->setRtpProcess(processor);
});
} else if (stream_id.empty()) {
// in tcp active mode one port carries exactly one stream, so a stream id is required; the TcpServer object is created only to pass parameters
throw std::runtime_error(StrPrinter << "tcp主动模式时必需指定流id");
}
}
_on_cleanup = [rtp_socket, stream_id]() {
if (rtp_socket) {
// break the circular reference
@ -240,9 +209,10 @@ void RtpServer::start(uint16_t local_port, const string &stream_id, TcpMode tcp_
_udp_server = udp_server;
_rtp_socket = rtp_socket;
_rtcp_helper = helper;
_tcp_mode = tcp_mode;
}
void RtpServer::setOnDetach(function<void()> cb) {
void RtpServer::setOnDetach(RtpProcess::onDetachCB cb) {
if (_rtcp_helper) {
_rtcp_helper->setOnDetach(std::move(cb));
}
@ -277,6 +247,7 @@ void RtpServer::connectToServer(const std::string &url, uint16_t port, const fun
void RtpServer::onConnect() {
auto rtp_session = std::make_shared<RtpSession>(_rtp_socket);
rtp_session->setRtpProcess(_rtcp_helper->getProcess());
rtp_session->attachServer(*_tcp_server);
_rtp_socket->setOnRead([rtp_session](const Buffer::Ptr &buf, struct sockaddr *addr, int addr_len) {
rtp_session->onRecv(buf);

View File

@ -62,7 +62,7 @@ public:
/**
* Set the RtpProcess onDetach event callback
*/
void setOnDetach(std::function<void()> cb);
void setOnDetach(RtpProcess::onDetachCB cb);
/**
* ssrc

View File

@ -10,7 +10,7 @@
#if defined(ENABLE_RTPPROXY)
#include "RtpSession.h"
#include "RtpSelector.h"
#include "RtpProcess.h"
#include "Network/TcpServer.h"
#include "Rtsp/Rtsp.h"
#include "Rtsp/RtpReceiver.h"
@ -60,28 +60,24 @@ void RtpSession::onRecv(const Buffer::Ptr &data) {
}
void RtpSession::onError(const SockException &err) {
WarnP(this) << _stream_id << " " << err;
if (_process) {
RtpSelector::Instance().delProcess(_stream_id, _process.get());
_process = nullptr;
if (_emit_detach) {
_process->onDetach(err);
}
WarnP(this) << _stream_id << " " << err;
}
void RtpSession::onManager() {
if (_process && !_process->alive()) {
shutdown(SockException(Err_timeout, "receive rtp timeout"));
}
if (!_process && _ticker.createdTime() > 10 * 1000) {
shutdown(SockException(Err_timeout, "illegal connection"));
}
}
void RtpSession::onRtpPacket(const char *data, size_t len) {
if (_delay_close) {
// delayed close in progress, ignore all data
return;
void RtpSession::setRtpProcess(RtpProcess::Ptr process) {
_emit_detach = (bool)process;
_process = std::move(process);
}
void RtpSession::onRtpPacket(const char *data, size_t len) {
if (!isRtp(data, len)) {
// ignore non-rtp data
WarnP(this) << "Not rtp packet";
@ -104,33 +100,31 @@ void RtpSession::onRtpPacket(const char *data, size_t len) {
return;
}
}
if (!_process) {
// ssrc not set yet, try to extract it
if (!_ssrc && !RtpSelector::getSSRC(data, len, _ssrc)) {
if (!_ssrc && !getSSRC(data, len, _ssrc)) {
return;
}
if (_stream_id.empty()) {
// when no stream id is specified, use the ssrc as the stream id
if (_stream_id.empty()) {
_stream_id = printSSRC(_ssrc);
}
try {
_process = RtpSelector::Instance().getProcess(_stream_id, true);
} catch (RtpSelector::ProcessExisted &ex) {
if (!_is_udp) {
// for tcp, disconnect immediately
throw;
}
// for udp, delay the disconnect (let the timeout close it) to avoid churning RtpSession objects
WarnP(this) << ex.what();
_delay_close = true;
return;
}
if (!_process) {
_process = RtpProcess::createProcess(_stream_id);
_process->setOnlyTrack((RtpProcess::OnlyTrack)_only_track);
_process->setDelegate(static_pointer_cast<RtpSession>(shared_from_this()));
weak_ptr<RtpSession> weak_self = static_pointer_cast<RtpSession>(shared_from_this());
_process->setOnDetach([weak_self](const SockException &ex) {
if (auto strong_self = weak_self.lock()) {
strong_self->_process = nullptr;
strong_self->shutdown(ex);
}
});
}
try {
uint32_t rtp_ssrc = 0;
RtpSelector::getSSRC(data, len, rtp_ssrc);
getSSRC(data, len, rtp_ssrc);
if (rtp_ssrc != _ssrc) {
WarnP(this) << "ssrc mismatched, rtp dropped: " << rtp_ssrc << " != " << _ssrc;
return;
@ -143,26 +137,10 @@ void RtpSession::onRtpPacket(const char *data, size_t len) {
} else {
throw;
}
} catch (std::exception &ex) {
if (!_is_udp) {
// for tcp, disconnect immediately
throw;
}
// for udp, delay the disconnect (let the timeout close it) to avoid churning RtpSession objects
WarnP(this) << ex.what();
_delay_close = true;
return;
}
_ticker.resetTime();
}
bool RtpSession::close(MediaSource &sender) {
// this callback is triggered from another thread
string err = StrPrinter << "close media: " << sender.getUrl();
safeShutdown(SockException(Err_shutdown, err));
return true;
}
static const char *findSSRC(const char *data, ssize_t len, uint32_t ssrc) {
// two bytes must be reserved in front of each rtp packet for the length field
for (ssize_t i = 2; i <= len - 4; ++i) {
@ -268,7 +246,7 @@ const char *RtpSession::searchByPsHeaderFlag(const char *data, size_t len) {
// TODO or not? update the stored ssrc
uint32_t rtp_ssrc = 0;
RtpSelector::getSSRC(rtp_ptr + 2, len, rtp_ssrc);
getSSRC(rtp_ptr + 2, len, rtp_ssrc);
_ssrc = rtp_ssrc;
InfoL << "设置_ssrc为" << _ssrc;
// RtpServer::updateSSRC(uint32_t ssrc)

View File

@ -20,7 +20,7 @@
namespace mediakit{
class RtpSession : public toolkit::Session, public RtpSplitter, public MediaSourceEvent {
class RtpSession : public toolkit::Session, public RtpSplitter {
public:
static const std::string kStreamID;
static const std::string kSSRC;
@ -34,10 +34,9 @@ public:
void onManager() override;
void setParams(toolkit::mINI &ini);
void attachServer(const toolkit::Server &server) override;
void setRtpProcess(RtpProcess::Ptr process);
protected:
// notify the pusher to stop streaming
bool close(MediaSource &sender) override;
// callback on rtp received
void onRtpPacket(const char *data, size_t len) override;
// RtpSplitter override
@ -48,10 +47,10 @@ protected:
const char *searchByPsHeaderFlag(const char *data, size_t len);
private:
bool _delay_close = false;
bool _is_udp = false;
bool _search_rtp = false;
bool _search_rtp_finished = false;
bool _emit_detach = false;
int _only_track = 0;
uint32_t _ssrc = 0;
toolkit::Ticker _ticker;

View File

@ -144,7 +144,16 @@ RtpMultiCaster::RtpMultiCaster(SocketHelper &helper, const string &local_ip, con
});
});
_rtp_reader->setDetachCB([this]() {
string strKey = StrPrinter << local_ip << " " << vhost << " " << app << " " << stream << endl;
_rtp_reader->setDetachCB([this, strKey]() {
{
lock_guard<recursive_mutex> lck(g_mtx);
auto it = g_multi_caster_map.find(strKey);
if (it != g_multi_caster_map.end()) {
g_multi_caster_map.erase(it);
}
}
unordered_map<void *, onDetach> _detach_map_copy;
{
lock_guard<recursive_mutex> lck(_mtx);

View File

@ -470,6 +470,15 @@ string printSSRC(uint32_t ui32Ssrc) {
return tmp;
}
bool getSSRC(const char *data, size_t data_len, uint32_t &ssrc) {
if (data_len < 12) {
return false;
}
uint32_t *ssrc_ptr = (uint32_t *)(data + 8);
ssrc = ntohl(*ssrc_ptr);
return true;
}
bool isRtp(const char *buf, size_t size) {
if (size < 2) {
return false;

View File

@ -317,6 +317,7 @@ toolkit::Buffer::Ptr makeRtpOverTcpPrefix(uint16_t size, uint8_t interleaved);
void makeSockPair(std::pair<toolkit::Socket::Ptr, toolkit::Socket::Ptr> &pair, const std::string &local_ip, bool re_use_port = false, bool is_udp = true);
// print the ssrc in hexadecimal
std::string printSSRC(uint32_t ui32Ssrc);
bool getSSRC(const char *data, size_t data_len, uint32_t &ssrc);
bool isRtp(const char *buf, size_t size);
bool isRtcp(const char *buf, size_t size);

View File

@ -28,6 +28,7 @@ using namespace std;
namespace mediakit {
enum PlayType { type_play = 0, type_pause, type_seek, type_speed };
enum class BeatType : uint32_t { both = 0, rtcp, cmd };
RtspPlayer::RtspPlayer(const EventPoller::Ptr &poller)
: TcpClient(poller) {}
@ -85,6 +86,8 @@ void RtspPlayer::play(const string &strUrl) {
_play_url = url._url;
_rtp_type = (Rtsp::eRtpType)(int)(*this)[Client::kRtpType];
_beat_type = (*this)[Client::kRtspBeatType].as<int>();
_beat_interval_ms = (*this)[Client::kBeatIntervalMS].as<int>();
DebugL << url._url << " " << (url._user.size() ? url._user : "null") << " " << (url._passwd.size() ? url._passwd : "null") << " " << _rtp_type;
weak_ptr<RtspPlayer> weakSelf = static_pointer_cast<RtspPlayer>(shared_from_this());
@ -210,7 +213,8 @@ void RtspPlayer::handleResDESCRIBE(const Parser &parser) {
if (play_track != TrackInvalid) {
auto track = sdpParser.getTrack(play_track);
_sdp_track.emplace_back(track);
sdp = track->toString();
auto title_track = sdpParser.getTrack(TrackTitle);
sdp = (title_track ? title_track->toString() : "") + track->toString();
} else {
_sdp_track = sdpParser.getAvailableTrack();
sdp = sdpParser.toString();
@ -641,23 +645,28 @@ void RtspPlayer::onBeforeRtpSorted(const RtpPacket::Ptr &rtp, int track_idx) {
rtcp_ctx->onRtp(rtp->getSeq(), rtp->getStamp(), rtp->ntp_stamp, rtp->sample_rate, rtp->size() - RtpPacket::kRtpTcpHeaderSize);
auto &ticker = _rtcp_send_ticker[track_idx];
if (ticker.elapsedTime() < 3 * 1000) {
// not time yet
if (ticker.elapsedTime() < _beat_interval_ms) {
// heartbeat interval not reached yet
return;
}
auto &rtcp_flag = _send_rtcp[track_idx];
// send a heartbeat every 3 seconds, alternating rtcp and rtsp commands; added for compatibility, see issue #642
// some rtsp servers need rtcp keepalives, others need command keepalives
// some rtsp servers need rtcp keepalives, others need command keepalives; rtcp and rtsp commands alternate, for compatibility with issue #642
auto &rtcp_flag = _send_rtcp[track_idx];
ticker.resetTime();
switch ((BeatType)_beat_type) {
case BeatType::cmd: rtcp_flag = false; break;
case BeatType::rtcp: rtcp_flag = true; break;
case BeatType::both:
default: rtcp_flag = !rtcp_flag; break;
}
// send a command keepalive
if (!rtcp_flag) {
if (track_idx == 0) {
// no need for both tracks to trigger the command keepalive
sendKeepAlive();
}
ticker.resetTime();
// next time, send an rtcp keepalive
rtcp_flag = true;
return;
}
@ -679,9 +688,6 @@ void RtspPlayer::onBeforeRtpSorted(const RtpPacket::Ptr &rtp, int track_idx) {
rtcp_sdes->chunks.ssrc = htonl(ssrc);
send_rtcp(this, track_idx, std::move(rtcp));
send_rtcp(this, track_idx, RtcpHeader::toBuffer(rtcp_sdes));
ticker.resetTime();
// next time, send a command keepalive
rtcp_flag = false;
}
void RtspPlayer::onPlayResult_l(const SockException &ex, bool handshake_done) {

View File

@ -114,6 +114,11 @@ private:
// alternate between rtcp and GET_PARAMETER keepalives
bool _send_rtcp[2] = {true, true};
// heartbeat type
uint32_t _beat_type = 0;
// heartbeat keepalive interval
uint32_t _beat_interval_ms = 0;
std::string _play_url;
std::vector<SdpTrack::Ptr> _sdp_track;
std::function<void(const Parser&)> _on_response;

View File

@ -10,8 +10,10 @@
#include <cstdlib>
#include "RtspSplitter.h"
#include "Util/logger.h"
#include "Util/util.h"
#include "Util/logger.h"
#include "Common/macros.h"
#include "Rtsp/RtpReceiver.h"
using namespace std;
using namespace toolkit;
@ -58,13 +60,28 @@ const char *RtspSplitter::onSearchPacketTail_l(const char *data, size_t len) {
ssize_t RtspSplitter::onRecvHeader(const char *data, size_t len) {
if (_isRtpPacket) {
try {
onRtpPacket(data, len);
} catch (RtpTrack::BadRtpException &ex) {
WarnL << ex.what();
}
return 0;
}
if (len == 4 && !memcmp(data, "\r\n\r\n", 4)) {
return 0;
}
try {
_parser.parse(data, len);
} catch (mediakit::AssertFailedException &ex){
if (!_enableRecvRtp) {
// still handshaking, abort the handshake directly
throw;
}
// handshake finished; if the rtsp server has a send-buffer overflow bug, rtsp commands may be interleaved with rtp
// in that case an rtsp parsing error does not break the connection, just drop this packet
WarnL << ex.what();
return 0;
}
auto ret = getContentLength(_parser);
if (ret == 0) {
onWholeRtspPacket(_parser);

View File

@ -20,7 +20,7 @@
#include "Thread/WorkThreadPool.h"
#include "Pusher/MediaPusher.h"
#include "Player/PlayerProxy.h"
#include "Record/MP4Reader.h"
using namespace std;
using namespace toolkit;
using namespace mediakit;
@ -52,7 +52,7 @@ public:
Option::ArgRequired,/*this option must be followed by a value*/
nullptr,/*default value of this option*/
true,/*whether this option must be set; with no default and ArgRequired, the user must provide it or an exception is thrown*/
"拉流url,支持rtsp/rtmp/hls",/*description of this option*/
"拉流url,支持rtsp/rtmp/hls/mp4文件",/*description of this option*/
nullptr);
(*_parser) << Option('o',/*short name of this option; \x00 means no short name*/
@ -98,9 +98,7 @@ public:
~CMD_main() override {}
const char *description() const override {
return "主程序命令参数";
}
const char *description() const override { return "主程序命令参数"; }
};
// this program is used for push-streaming performance testing
@ -129,6 +127,8 @@ int main(int argc, char *argv[]) {
cout << "推流协议只支持rtsp或rtmp" << endl;
return -1;
}
const std::string app = "app";
const std::string stream = "test";
// set up logging
Logger::Instance().add(std::make_shared<ConsoleChannel>("ConsoleChannel", logLevel));
@ -145,22 +145,39 @@ int main(int argc, char *argv[]) {
ProtocolOption option;
option.enable_hls = false;
option.enable_mp4 = false;
MediaSource::Ptr src = nullptr;
PlayerProxy::Ptr proxy = nullptr;
if (end_with(in_url, ".mp4")) {
// create MediaSource from mp4file
auto reader = std::make_shared<MP4Reader>(DEFAULT_VHOST, app, stream, in_url);
// loop the mp4 file
reader->startReadMP4(0, true, true);
src = MediaSource::find(schema, DEFAULT_VHOST, app, stream, false);
if (!src) {
// mp4 file does not exist
WarnL << "no such file or directory: " << in_url;
return -1;
}
} else {
// add a pull-stream proxy
auto proxy = std::make_shared<PlayerProxy>(DEFAULT_VHOST, "app", "test", option);
proxy = std::make_shared<PlayerProxy>(DEFAULT_VHOST, app, stream, option);
// rtsp pull-proxy mode
(*proxy)[Client::kRtpType] = rtp_type;
// start the pull-stream proxy
proxy->play(in_url);
auto get_src = [schema]() {
return MediaSource::find(schema, DEFAULT_VHOST, "app", "test", false);
}
auto get_src = [schema,app,stream]() {
return MediaSource::find(schema, DEFAULT_VHOST, app, stream, false);
};
// map of pushers
recursive_mutex mtx;
unordered_map<void *, MediaPusher::Ptr> pusher_map;
auto add_pusher = [&](const MediaSource::Ptr &src, const string &rand_str, size_t index) {
auto pusher = std::make_shared<MediaPusher>(src);
auto tag = pusher.get();

View File

@ -17,7 +17,7 @@
#include "Rtsp/RtspSession.h"
#include "Rtmp/RtmpSession.h"
#include "Http/HttpSession.h"
#include "Rtp/RtpSelector.h"
#include "Rtp/RtpProcess.h"
using namespace std;
using namespace toolkit;
@ -42,7 +42,7 @@ static bool loadFile(const char *path, const EventPoller::Ptr &poller) {
memset(&addr, 0, sizeof(addr));
addr.ss_family = AF_INET;
auto sock = Socket::createSocket(poller);
auto process = RtpSelector::Instance().getProcess("test", true);
auto process = RtpProcess::createProcess("test");
uint64_t stamp_last = 0;
auto total_size = std::make_shared<size_t>(0);
@ -89,7 +89,6 @@ static bool loadFile(const char *path, const EventPoller::Ptr &poller) {
auto ret = do_read();
if (!ret) {
WarnL << *total_size / 1024 << "KB";
RtpSelector::Instance().delProcess("test", process.get());
}
return ret;
});

View File

@ -383,7 +383,7 @@ namespace RTC
});
// Set ciphers.
ret = SSL_CTX_set_cipher_list(
sslCtx, "DEFAULT:!NULL:!aNULL:!SHA256:!SHA384:!aECDH:!AESGCM+AES256:!aPSK");
sslCtx, "DEFAULT:!NULL:!aNULL:!SHA256:!SHA384:!aECDH:!AESGCM+AES256:!aPSK:!RC4");
if (ret == 0)
{

View File

@ -9,24 +9,56 @@
*/
#include "Nack.h"
#include "Common/config.h"
using namespace std;
using namespace toolkit;
namespace mediakit {
static constexpr uint32_t kMaxNackMS = 5 * 1000;
static constexpr uint32_t kRtpCacheCheckInterval = 100;
// RTC config items
namespace Rtc {
#define RTC_FIELD "rtc."
//~ nack receiver side
// max age of packets kept in the nack cache
const string kMaxNackMS = RTC_FIELD "maxNackMS";
// nack cache check interval (in packets)
const string kRtpCacheCheckInterval = RTC_FIELD "rtpCacheCheckInterval";
//~ nack sender side
// max number of rtp loss records kept
const string kNackMaxSize = RTC_FIELD "nackMaxSize";
// max time an rtp loss record is kept
const string kNackMaxMS = RTC_FIELD "nackMaxMS";
// max number of retransmission requests per packet
const string kNackMaxCount = RTC_FIELD "nackMaxCount";
// nack retransmission interval, as a multiple of rtt
const string kNackIntervalRatio = RTC_FIELD "nackIntervalRatio";
// number of rtp per nack packet; lowering this makes nack respond faster
const string kNackRtpSize = RTC_FIELD "nackRtpSize";
static onceToken token([]() {
mINI::Instance()[kMaxNackMS] = 5 * 1000;
mINI::Instance()[kRtpCacheCheckInterval] = 100;
mINI::Instance()[kNackMaxSize] = 2048;
mINI::Instance()[kNackMaxMS] = 3 * 1000;
mINI::Instance()[kNackMaxCount] = 15;
mINI::Instance()[kNackIntervalRatio] = 1.0f;
mINI::Instance()[kNackRtpSize] = 8;
});
} // namespace Rtc
void NackList::pushBack(RtpPacket::Ptr rtp) {
auto seq = rtp->getSeq();
_nack_cache_seq.emplace_back(seq);
_nack_cache_pkt.emplace(seq, std::move(rtp));
if (++_cache_ms_check < kRtpCacheCheckInterval) {
GET_CONFIG(uint32_t, rtpcache_checkinterval, Rtc::kRtpCacheCheckInterval);
if (++_cache_ms_check < rtpcache_checkinterval) {
return;
}
_cache_ms_check = 0;
while (getCacheMS() >= kMaxNackMS) {
GET_CONFIG(uint32_t, maxnackms, Rtc::kMaxNackMS);
while (getCacheMS() >= maxnackms) {
// need to evict part of the nack cache
popFront();
}
@ -148,10 +180,13 @@ void NackContext::makeNack(uint16_t max_seq, bool flush) {
eraseFrontSeq();
// generate at most 5 nack packets, so a big seq jump cannot loop forever
auto max_nack = 5u;
GET_CONFIG(uint32_t, nack_rtpsize, Rtc::kNackRtpSize);
// kNackRtpSize must be between 0 and 16
nack_rtpsize = std::min<uint32_t>(nack_rtpsize, FCI_NACK::kBitSize);
while (_nack_seq != max_seq && max_nack--) {
// one nack cannot carry the status of more than 16+1 rtp packets
uint16_t nack_rtp_count = std::min<uint16_t>(FCI_NACK::kBitSize, max_seq - (uint16_t)(_nack_seq + 1));
if (!flush && nack_rtp_count < kNackRtpSize) {
if (!flush && nack_rtp_count < nack_rtpsize) {
// not flushing, and not enough seqs yet to send a nack
break;
}
@ -206,7 +241,9 @@ void NackContext::clearNackStatus(uint16_t seq) {
_nack_send_status.erase(it);
// clamp rtt to a reasonable valid range
_rtt = max<int>(10, min<int>(rtt, kNackMaxMS / kNackMaxCount));
GET_CONFIG(uint32_t, nack_maxms, Rtc::kNackMaxMS);
GET_CONFIG(uint32_t, nack_maxcount, Rtc::kNackMaxCount);
_rtt = max<int>(10, min<int>(rtt, nack_maxms / nack_maxcount));
}
void NackContext::recordNack(const FCI_NACK &nack) {
@ -222,7 +259,8 @@ void NackContext::recordNack(const FCI_NACK &nack) {
++i;
}
// too many records, drop some of the oldest
while (_nack_send_status.size() > kNackMaxSize) {
GET_CONFIG(uint32_t, nack_maxsize, Rtc::kNackMaxSize);
while (_nack_send_status.size() > nack_maxsize) {
_nack_send_status.erase(_nack_send_status.begin());
}
}
@ -230,13 +268,16 @@ void NackContext::recordNack(const FCI_NACK &nack) {
uint64_t NackContext::reSendNack() {
set<uint16_t> nack_rtp;
auto now = getCurrentMillisecond();
GET_CONFIG(uint32_t, nack_maxms, Rtc::kNackMaxMS);
GET_CONFIG(uint32_t, nack_maxcount, Rtc::kNackMaxCount);
GET_CONFIG(float, nack_intervalratio, Rtc::kNackIntervalRatio);
for (auto it = _nack_send_status.begin(); it != _nack_send_status.end();) {
if (now - it->second.first_stamp > kNackMaxMS) {
if (now - it->second.first_stamp > nack_maxms) {
// this rtp has been lost for too long, stop requesting retransmission
it = _nack_send_status.erase(it);
continue;
}
if (now - it->second.update_stamp < kNackIntervalRatio * _rtt) {
if (now - it->second.update_stamp < nack_intervalratio * _rtt) {
// last nack was less than ratio * rtt ago, no need to nack again yet
++it;
continue;
@ -245,7 +286,7 @@ uint64_t NackContext::reSendNack() {
nack_rtp.emplace(it->first);
// update the nack send timestamp
it->second.update_stamp = now;
if (++(it->second.nack_count) == kNackMaxCount) {
if (++(it->second.nack_count) == nack_maxcount) {
// nacked too many times, remove it
it = _nack_send_status.erase(it);
continue;

View File

@ -41,18 +41,6 @@ class NackContext {
public:
using Ptr = std::shared_ptr<NackContext>;
using onNack = std::function<void(const FCI_NACK &nack)>;
// max number of rtp loss records kept
static constexpr auto kNackMaxSize = 2048;
// max time an rtp loss record is kept
static constexpr auto kNackMaxMS = 3 * 1000;
// max number of nack retransmission requests
static constexpr auto kNackMaxCount = 15;
// nack retransmission interval, as a multiple of rtt
static constexpr auto kNackIntervalRatio = 1.0f;
// number of rtp per nack packet; lowering this makes nack respond faster
static constexpr auto kNackRtpSize = 8;
static_assert(kNackRtpSize >=0 && kNackRtpSize <= FCI_NACK::kBitSize, "NackContext::kNackRtpSize must between 0 and 16");
NackContext();

View File

@ -57,6 +57,9 @@ const string kMinBitrate = RTC_FIELD "min_bitrate";
// data channel settings
const string kDataChannelEcho = RTC_FIELD "datachannel_echo";
// max time an rtp loss record is kept
const string kNackMaxMS = RTC_FIELD "nackMaxMS";
static onceToken token([]() {
mINI::Instance()[kTimeOutSec] = 15;
mINI::Instance()[kExternIP] = "";
@ -69,6 +72,8 @@ static onceToken token([]() {
mINI::Instance()[kMinBitrate] = 0;
mINI::Instance()[kDataChannelEcho] = true;
mINI::Instance()[kNackMaxMS] = 3 * 1000;
});
} // namespace RTC
@ -800,7 +805,8 @@ public:
_on_nack = std::move(on_nack);
setOnSorted(std::move(cb));
// set jitter buffer parameters
RtpTrackImp::setParams(1024, NackContext::kNackMaxMS, 512);
GET_CONFIG(uint32_t, nack_maxms, Rtc::kNackMaxMS);
RtpTrackImp::setParams(1024, nack_maxms, 512);
_nack_ctx.setOnNack([this](const FCI_NACK &nack) { onNack(nack); });
}

View File

@ -1,16 +0,0 @@
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
.cxx
local.properties
/.idea/

Binary file not shown.

View File

@ -1,2 +0,0 @@
/build
.cxx

View File

@ -1,54 +0,0 @@
plugins {
id 'com.android.application'
id 'org.jetbrains.kotlin.android'
id 'kotlin-android-extensions'
id 'kotlin-kapt'
}
apply plugin: 'kotlin-android'
android {
compileSdk 32
defaultConfig {
applicationId "com.zlmediakit.webrtc"
minSdk 21
targetSdk 32
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
kotlinOptions {
jvmTarget = '1.8'
}
}
dependencies {
implementation 'androidx.core:core-ktx:1.7.0'
implementation 'androidx.appcompat:appcompat:1.5.1'
implementation 'com.google.android.material:material:1.6.1'
implementation 'androidx.constraintlayout:constraintlayout:2.1.4'
testImplementation 'junit:junit:4.13.2'
androidTestImplementation 'androidx.test.ext:junit:1.1.3'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0'
implementation 'com.google.code.gson:gson:2.8.9'
implementation("com.squareup.okhttp3:okhttp:4.10.0")
implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version"
implementation 'org.webrtc:google-webrtc:1.0.32006'
}

View File

@ -1,21 +0,0 @@
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile

View File

@ -1,24 +0,0 @@
package com.zlmediakit.webrtc
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Test
import org.junit.runner.RunWith
import org.junit.Assert.*
/**
* Instrumented test, which will execute on an Android device.
*
* See [testing documentation](http://d.android.com/tools/testing).
*/
@RunWith(AndroidJUnit4::class)
class ExampleInstrumentedTest {
@Test
fun useAppContext() {
// Context of the app under test.
val appContext = InstrumentationRegistry.getInstrumentation().targetContext
assertEquals("com.zlmediakit.webrtc", appContext.packageName)
}
}

View File

@ -1,46 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="com.zlmediakit.webrtc">
<uses-feature android:name="android.hardware.camera"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-feature
android:glEsVersion="0x00020000"
android:required="true"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.CAPTURE_VIDEO_OUTPUT"/>
<uses-permission android:name="android.permission.READ_PHONE_STATE"/>
<application
android:allowBackup="true"
android:dataExtractionRules="@xml/data_extraction_rules"
android:fullBackupContent="@xml/backup_rules"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/Theme.AndroidWebRTC"
android:usesCleartextTraffic="true"
tools:targetApi="31">
<activity
android:name=".MainActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
</manifest>

View File

@ -1,79 +0,0 @@
package com.zlmediakit.webrtc
import android.annotation.SuppressLint
import android.graphics.drawable.BitmapDrawable
import android.graphics.drawable.Drawable
import android.os.Bundle
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity
import kotlinx.android.synthetic.main.activity_main.*
import kotlinx.android.synthetic.main.activity_main.view.*
class MainActivity : AppCompatActivity() {
private var isSpeaker = true
@SuppressLint("SetTextI18n")
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
lifecycle.addObserver(web_rtc_sv)
//http://124.223.98.45/index/api/webrtc?app=live&stream=test&type=play
url.setText("http://124.223.98.45/index/api/webrtc?app=live&stream=test&type=play")
//http://192.168.1.17/index/api/webrtc?app=live&stream=test&type=play
btn_play.setOnClickListener {
web_rtc_sv?.setVideoPath(url.text.toString())
web_rtc_sv.start()
}
web_rtc_sv.setOnErrorListener { errorCode, errorMsg ->
runOnUiThread {
Toast.makeText(this, "errorCode:$errorCode,errorMsg:$errorMsg", Toast.LENGTH_SHORT)
.show()
}
}
btn_pause.setOnClickListener {
web_rtc_sv?.pause()
}
btn_resume.setOnClickListener {
web_rtc_sv?.resume()
}
btn_screenshot.setOnClickListener {
web_rtc_sv?.screenshot {
runOnUiThread {
iv_screen.setImageDrawable(BitmapDrawable(it))
}
}
}
btn_mute.setOnClickListener {
web_rtc_sv.mute(true)
}
selectAudio()
btn_speaker.setOnClickListener {
selectAudio()
}
}
fun selectAudio() {
btn_speaker.setText(if (isSpeaker) "扬声器" else "话筒")
web_rtc_sv.setSpeakerphoneOn(isSpeaker)
isSpeaker = !isSpeaker
}
}
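MainActivity above hardcodes the signaling URL; ZLMediaKit's WebRTC play endpoint follows the pattern `http://host/index/api/webrtc?app=…&stream=…&type=play`. A minimal sketch of a helper that assembles that URL (the class and method names are hypothetical, not part of the demo):

```java
// Hypothetical helper mirroring the URL hardcoded in MainActivity.
// ZLMediaKit's WebRTC signaling endpoint takes app/stream/type query parameters.
public class WebrtcUrl {
    public static String playUrl(String host, String app, String stream) {
        // "type=play" requests a playback (receive-only) session.
        return "http://" + host + "/index/api/webrtc?app=" + app
                + "&stream=" + stream + "&type=play";
    }

    public static void main(String[] args) {
        // Reproduces the address used by the demo against a LAN server.
        System.out.println(playUrl("192.168.1.17", "live", "test"));
    }
}
```

Keeping the URL assembly in one place makes it easier to switch hosts or streams than editing the `EditText` default.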

@@ -1,439 +0,0 @@
package com.zlmediakit.webrtc
import android.content.Context
import android.graphics.Bitmap
import android.media.AudioManager
import android.util.AttributeSet
import android.util.Log
import android.view.LayoutInflater
import android.widget.RelativeLayout
import androidx.lifecycle.DefaultLifecycleObserver
import androidx.lifecycle.LifecycleOwner
import com.google.gson.Gson
import okhttp3.*
import okhttp3.MediaType.Companion.toMediaType
import okhttp3.MediaType.Companion.toMediaTypeOrNull
import org.webrtc.*
import org.webrtc.RendererCommon.ScalingType
import org.webrtc.audio.AudioDeviceModule
import org.webrtc.audio.JavaAudioDeviceModule
import java.io.IOException
import java.util.*
public class WebRTCSurfaceView(context: Context, attrs: AttributeSet?) :
RelativeLayout(context, attrs), DefaultLifecycleObserver, RendererCommon.RendererEvents {
private data class sdp(var sdp: String, var username: String, var password: String)
private data class SdpResponse(var code: Int, var id: String, var sdp: String, var type: String)
private enum class ErrorCode(val errorCode: Int) {
SUCCESS(0x00),
GET_REMOTE_SDP_ERROR(0x01);
}
companion object {
private val TAG = "WebRTCSurfaceView"
}
private var mContext: Context = context
private val eglBase: EglBase = EglBase.create()
private var mEGLBaseContext: EglBase.Context = eglBase.eglBaseContext
private lateinit var videoUrl: String
private var mPeerConnectionFactory: PeerConnectionFactory? = null
private var mLocalMediaStream: MediaStream? = null
private var mLocalAudioTrack: AudioTrack? = null
private var mAudioSource: AudioSource? = null
private var mLocalSessionDescription: SessionDescription? = null
private var mRemoteSessionDescription: SessionDescription? = null
private var mLocalPeer: Peer? = null
private var mSurfaceViewRenderer: SurfaceViewRenderer
private lateinit var OnErrorListener: (errorCode: Int, errorMsg: String) -> Unit?
fun setOnErrorListener(listener: (errorCode: Int, errorMsg: String) -> Unit) {
this.OnErrorListener = listener
}
private lateinit var OnPreparedListener: () -> Unit?
fun setOnPreparedListener(listener: () -> Unit) {
this.OnPreparedListener = listener
}
private val audioManager: AudioManager
init {
val view = LayoutInflater.from(mContext).inflate(R.layout.layout_videoview, this)
mPeerConnectionFactory = createConnectionFactory()
mSurfaceViewRenderer = view.findViewById(R.id.surface_view_renderer)
mSurfaceViewRenderer.init(mEGLBaseContext, this)
mSurfaceViewRenderer.setScalingType(ScalingType.SCALE_ASPECT_FILL)
mSurfaceViewRenderer.setEnableHardwareScaler(true)
// Create the local media stream
mLocalMediaStream = mPeerConnectionFactory?.createLocalMediaStream("ARDAMS")
// Capture audio
mAudioSource = mPeerConnectionFactory?.createAudioSource(createAudioConstraints())
mLocalAudioTrack = mPeerConnectionFactory?.createAudioTrack("ARDAMSa0", mAudioSource)
// Add the audio track
mLocalMediaStream?.addTrack(mLocalAudioTrack)
audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
audioManager.isSpeakerphoneOn = false
}
private fun set(width: Int, height: Int) {
layoutParams.width = width
layoutParams.height = height
}
private fun createConnectionFactory(): PeerConnectionFactory? {
val options = PeerConnectionFactory.InitializationOptions.builder(mContext)
.setEnableInternalTracer(false)
.createInitializationOptions()
PeerConnectionFactory.initialize(options)
val videoEncoderFactory = DefaultVideoEncoderFactory(
mEGLBaseContext,
true,
true
)
val videoDecoderFactory = DefaultVideoDecoderFactory(mEGLBaseContext)
val audioDevice = createJavaAudioDevice()
val peerConnectionFactory = PeerConnectionFactory.builder()
.setAudioDeviceModule(audioDevice)
.setVideoEncoderFactory(videoEncoderFactory)
.setVideoDecoderFactory(videoDecoderFactory)
.createPeerConnectionFactory()
audioDevice.release()
return peerConnectionFactory
}
private fun createAudioConstraints(): MediaConstraints {
val audioConstraints = MediaConstraints()
audioConstraints.mandatory.add(
MediaConstraints.KeyValuePair(
"googEchoCancellation",
"true"
)
)
audioConstraints.mandatory.add(
MediaConstraints.KeyValuePair(
"googAutoGainControl",
"false"
)
)
audioConstraints.mandatory.add(
MediaConstraints.KeyValuePair(
"googHighpassFilter",
"true"
)
)
audioConstraints.mandatory.add(
MediaConstraints.KeyValuePair(
"googNoiseSuppression",
"true"
)
)
return audioConstraints
}
private fun offerOrAnswerConstraint(): MediaConstraints {
val mediaConstraints = MediaConstraints()
val keyValuePairs = java.util.ArrayList<MediaConstraints.KeyValuePair>()
keyValuePairs.add(MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"))
keyValuePairs.add(MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"))
mediaConstraints.mandatory.addAll(keyValuePairs)
return mediaConstraints
}
private fun createJavaAudioDevice(): AudioDeviceModule {
val audioTrackErrorCallback: JavaAudioDeviceModule.AudioTrackErrorCallback = object :
JavaAudioDeviceModule.AudioTrackErrorCallback {
override fun onWebRtcAudioTrackInitError(errorMessage: String) {
Log.i(TAG, "onWebRtcAudioTrackInitError ============> $errorMessage")
}
override fun onWebRtcAudioTrackStartError(
errorCode: JavaAudioDeviceModule.AudioTrackStartErrorCode, errorMessage: String
) {
Log.i(TAG, "onWebRtcAudioTrackStartError ============> $errorCode:$errorMessage")
}
override fun onWebRtcAudioTrackError(errorMessage: String) {
Log.i(TAG, "onWebRtcAudioTrackError ============> $errorMessage")
}
}
// Set audio track state callbacks.
val audioTrackStateCallback: JavaAudioDeviceModule.AudioTrackStateCallback = object :
JavaAudioDeviceModule.AudioTrackStateCallback {
override fun onWebRtcAudioTrackStart() {
Log.i(TAG, "onWebRtcAudioTrackStart ============>")
}
override fun onWebRtcAudioTrackStop() {
Log.i(TAG, "onWebRtcAudioTrackStop ============>")
}
}
return JavaAudioDeviceModule.builder(mContext)
.setUseHardwareAcousticEchoCanceler(true)
.setUseHardwareNoiseSuppressor(true)
.setAudioTrackErrorCallback(audioTrackErrorCallback)
.setAudioTrackStateCallback(audioTrackStateCallback)
.setUseStereoOutput(true) // stereo output
.createAudioDeviceModule()
}
fun setVideoPath(url: String) {
videoUrl = url
}
fun start() {
mLocalPeer = Peer {
val okHttpClient = OkHttpClient.Builder().build()
val body = RequestBody.create("text/plain; charset=utf-8".toMediaType(), it!!)
val request: Request = Request.Builder()
.url(videoUrl)
.post(body)
.build()
val call: Call = okHttpClient.newCall(request)
call.enqueue(object : Callback {
override fun onFailure(call: Call, e: IOException) {
Log.i(TAG, "onFailure")
OnErrorListener?.invoke(
ErrorCode.GET_REMOTE_SDP_ERROR.errorCode,
e.message.toString()
)
}
override fun onResponse(call: Call, response: Response) {
val body = response.body?.string()
val sdpResponse = Gson().fromJson(body, SdpResponse::class.java)
try {
mRemoteSessionDescription = SessionDescription(
SessionDescription.Type.fromCanonicalForm("answer"),
sdpResponse.sdp
)
Log.i(
TAG,
"RemoteSdpObserver onCreateSuccess:[SessionDescription[type=${mRemoteSessionDescription?.type?.name},description=${mRemoteSessionDescription?.description}]]"
)
mLocalPeer?.setRemoteDescription(mRemoteSessionDescription!!)
} catch (e: Exception) {
Log.i(TAG, e.toString())
OnErrorListener.invoke(
ErrorCode.GET_REMOTE_SDP_ERROR.errorCode,
e.localizedMessage
)
}
}
})
}
}
fun pause() {
mSurfaceViewRenderer.pauseVideo()
//mSurfaceViewRenderer.disableFpsReduction()
}
fun resume() {
mSurfaceViewRenderer.setFpsReduction(15f)
}
fun screenshot(listener: (bitmap: Bitmap) -> Unit) {
mSurfaceViewRenderer.addFrameListener({
listener.invoke(it)
}, 1f)
}
fun setSpeakerphoneOn(on: Boolean) {
audioManager.isSpeakerphoneOn = on
}
fun mute(on:Boolean) {
audioManager.isMicrophoneMute=on
}
override fun onDestroy(owner: LifecycleOwner) {
super.onDestroy(owner)
mSurfaceViewRenderer.release()
mLocalPeer?.mPeerConnection?.dispose()
mAudioSource?.dispose()
mPeerConnectionFactory?.dispose()
}
override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec)
}
inner class Peer(var sdp: (String?) -> Unit = {}) : PeerConnection.Observer, SdpObserver {
var mPeerConnection: PeerConnection? = null
init {
mPeerConnection = createPeerConnection()
mPeerConnection?.createOffer(this, offerOrAnswerConstraint())
}
// Initialize the RTCPeerConnection pipeline
private fun createPeerConnection(): PeerConnection? {
if (mPeerConnectionFactory == null) {
mPeerConnectionFactory = createConnectionFactory()
}
// Build the connection configuration (the observer callbacks are implemented by this class)
val ICEServers = LinkedList<PeerConnection.IceServer>()
val rtcConfig = PeerConnection.RTCConfiguration(ICEServers)
// Note: with Plan B semantics a receive-only audio/video configuration cannot be used
//rtcConfig.sdpSemantics = PeerConnection.SdpSemantics.PLAN_B
return mPeerConnectionFactory?.createPeerConnection(rtcConfig, this)
}
fun setRemoteDescription(sdp: SessionDescription) {
mPeerConnection?.setRemoteDescription(this, sdp)
}
override fun onCreateSuccess(sessionDescription: SessionDescription?) {
mPeerConnection?.setLocalDescription(this, sessionDescription)
mPeerConnection?.addStream(mLocalMediaStream)
sdp.invoke(sessionDescription?.description)
}
override fun onSetSuccess() {
}
override fun onCreateFailure(p0: String?) {
}
override fun onSetFailure(p0: String?) {
}
override fun onSignalingChange(signalingState: PeerConnection.SignalingState?) {
Log.i(TAG, "onSignalingChange ============> " + signalingState.toString())
}
override fun onIceConnectionChange(iceConnectionState: PeerConnection.IceConnectionState?) {
Log.i(TAG, "onIceConnectionChange ============> " + iceConnectionState.toString())
}
override fun onIceConnectionReceivingChange(p0: Boolean) {
Log.i(TAG, "onIceConnectionReceivingChange ============> $p0")
}
override fun onIceGatheringChange(iceGatheringState: PeerConnection.IceGatheringState?) {
Log.i(TAG, "onIceGatheringChange ============> ${iceGatheringState.toString()}")
}
override fun onIceCandidate(iceCandidate: IceCandidate?) {
Log.i(TAG, "onIceCandidate ============> ${iceCandidate.toString()}")
}
override fun onIceCandidatesRemoved(p0: Array<out IceCandidate>?) {
Log.i(TAG, "onIceCandidatesRemoved ============> ${p0.toString()}")
}
override fun onAddStream(mediaStream: MediaStream?) {
Log.i(TAG, "onAddStream ============> ${mediaStream?.toString()}")
if (mediaStream?.videoTracks?.isEmpty() != true) {
val remoteVideoTrack = mediaStream?.videoTracks?.get(0)
remoteVideoTrack?.setEnabled(true)
remoteVideoTrack?.addSink(mSurfaceViewRenderer)
}
if (mediaStream?.audioTracks?.isEmpty() != true) {
val remoteAudioTrack = mediaStream?.audioTracks?.get(0)
remoteAudioTrack?.setEnabled(true)
remoteAudioTrack?.setVolume(1.0)
}
}
override fun onRemoveStream(mediaStream: MediaStream?) {
Log.i(TAG, "onRemoveStream ============> ${mediaStream.toString()}")
}
override fun onDataChannel(dataChannel: DataChannel?) {
Log.i(TAG, "onDataChannel ============> ${dataChannel.toString()}")
}
override fun onRenegotiationNeeded() {
Log.i(TAG, "onRenegotiationNeeded ============>")
}
override fun onAddTrack(rtpReceiver: RtpReceiver?, p1: Array<out MediaStream>?) {
Log.i(TAG, "onAddTrack ============>" + rtpReceiver?.track())
Log.i(TAG, "onAddTrack ============>" + p1?.size)
}
}
override fun onFirstFrameRendered() {
Log.i(TAG, "onFirstFrameRendered ============>")
}
override fun onFrameResolutionChanged(frameWidth: Int, frameHeight: Int, rotation: Int) {
Log.i(TAG, "onFrameResolutionChanged ============> $frameWidth:$frameHeight:$rotation")
//set(frameWidth,frameHeight)
}
}
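The `onResponse` callback above parses the JSON answer into `SdpResponse` but uses `sdp` without checking the `code` field first; ZLMediaKit's HTTP API conventionally reports success with `code == 0`. A hedged sketch of validating the response before building the remote `SessionDescription` (the class here is a plain stand-in for the `SdpResponse` data class, and `answerOrNull` is a hypothetical helper):

```java
// Sketch: validate the signaling response before using its SDP.
// Field names mirror the SdpResponse data class in WebRTCSurfaceView;
// a non-zero code is assumed to mean the server rejected the offer.
public class SdpAnswer {
    public final int code;
    public final String sdp;

    public SdpAnswer(int code, String sdp) {
        this.code = code;
        this.sdp = sdp;
    }

    /** Returns the answer SDP, or null when the exchange failed. */
    public static String answerOrNull(SdpAnswer resp) {
        if (resp == null || resp.code != 0 || resp.sdp == null || resp.sdp.isEmpty()) {
            return null; // caller should surface GET_REMOTE_SDP_ERROR instead
        }
        return resp.sdp;
    }

    public static void main(String[] args) {
        System.out.println(answerOrNull(new SdpAnswer(0, "v=0")));  // valid answer
        System.out.println(answerOrNull(new SdpAnswer(-1, null))); // rejected offer
    }
}
```

Returning null (or an error result) lets the caller invoke the existing `OnErrorListener` path instead of throwing inside the Gson/SessionDescription block.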

@@ -1,30 +0,0 @@
<vector xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:aapt="http://schemas.android.com/aapt"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path android:pathData="M31,63.928c0,0 6.4,-11 12.1,-13.1c7.2,-2.6 26,-1.4 26,-1.4l38.1,38.1L107,108.928l-32,-1L31,63.928z">
<aapt:attr name="android:fillColor">
<gradient
android:endX="85.84757"
android:endY="92.4963"
android:startX="42.9492"
android:startY="49.59793"
android:type="linear">
<item
android:color="#44000000"
android:offset="0.0" />
<item
android:color="#00000000"
android:offset="1.0" />
</gradient>
</aapt:attr>
</path>
<path
android:fillColor="#FFFFFF"
android:fillType="nonZero"
android:pathData="M65.3,45.828l3.8,-6.6c0.2,-0.4 0.1,-0.9 -0.3,-1.1c-0.4,-0.2 -0.9,-0.1 -1.1,0.3l-3.9,6.7c-6.3,-2.8 -13.4,-2.8 -19.7,0l-3.9,-6.7c-0.2,-0.4 -0.7,-0.5 -1.1,-0.3C38.8,38.328 38.7,38.828 38.9,39.228l3.8,6.6C36.2,49.428 31.7,56.028 31,63.928h46C76.3,56.028 71.8,49.428 65.3,45.828zM43.4,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2c-0.3,-0.7 -0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C45.3,56.528 44.5,57.328 43.4,57.328L43.4,57.328zM64.6,57.328c-0.8,0 -1.5,-0.5 -1.8,-1.2s-0.1,-1.5 0.4,-2.1c0.5,-0.5 1.4,-0.7 2.1,-0.4c0.7,0.3 1.2,1 1.2,1.8C66.5,56.528 65.6,57.328 64.6,57.328L64.6,57.328z"
android:strokeWidth="1"
android:strokeColor="#00000000" />
</vector>

@@ -1,170 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillColor="#3DDC84"
android:pathData="M0,0h108v108h-108z" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,29L89,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,39L89,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,49L89,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,59L89,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,69L89,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,79L89,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,19L29,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,19L39,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,19L49,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,19L59,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,19L69,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,19L79,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
</vector>

@@ -1,93 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<com.zlmediakit.webrtc.WebRTCSurfaceView
android:id="@+id/web_rtc_sv"
android:layout_width="match_parent"
android:layout_height="200dp"
app:layout_constraintTop_toTopOf="parent" />
<androidx.appcompat.widget.AppCompatEditText
android:id="@+id/url"
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:layout_constraintTop_toBottomOf="@+id/web_rtc_sv"
android:text=""/>
<LinearLayout
android:id="@+id/ll"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:layout_marginTop="30dp"
app:layout_constraintTop_toBottomOf="@+id/url">
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_play"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="播放" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_pause"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="暂停" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_resume"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="恢复" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_speaker"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="扬声器" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_mute"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="静音" />
</LinearLayout>
<LinearLayout
android:id="@+id/ll2"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:layout_marginTop="10dp"
app:layout_constraintTop_toBottomOf="@+id/ll">
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_screenshot"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="截图" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_screen_record"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="录制" />
</LinearLayout>
<androidx.appcompat.widget.AppCompatImageView
android:id="@+id/iv_screen"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:layout_constraintBottom_toBottomOf="parent"
tools:ignore="MissingConstraints" />
</androidx.constraintlayout.widget.ConstraintLayout>

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto">
<org.webrtc.SurfaceViewRenderer
android:id="@+id/surface_view_renderer"
android:layout_width="wrap_content"
android:layout_height="wrap_content"/>
</RelativeLayout>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>

Binary files not shown (10 deleted image assets; sizes: 1.4 KiB, 2.8 KiB, 982 B, 1.7 KiB, 1.9 KiB, 3.8 KiB, 2.8 KiB, 5.8 KiB, 3.8 KiB, 7.6 KiB).

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.AndroidWebRTC" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_200</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/black</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_200</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="purple_200">#FFBB86FC</color>
<color name="purple_500">#FF6200EE</color>
<color name="purple_700">#FF3700B3</color>
<color name="teal_200">#FF03DAC5</color>
<color name="teal_700">#FF018786</color>
<color name="black">#FF000000</color>
<color name="white">#FFFFFFFF</color>
</resources>

@@ -1,3 +0,0 @@
<resources>
<string name="app_name">AndroidWebRTC</string>
</resources>

@@ -1,16 +0,0 @@
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.AndroidWebRTC" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
<item name="colorPrimary">@color/purple_500</item>
<item name="colorPrimaryVariant">@color/purple_700</item>
<item name="colorOnPrimary">@color/white</item>
<!-- Secondary brand color. -->
<item name="colorSecondary">@color/teal_200</item>
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor" tools:targetApi="l">?attr/colorPrimaryVariant</item>
<!-- Customize your theme here. -->
</style>
</resources>

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample backup rules file; uncomment and customize as necessary.
See https://developer.android.com/guide/topics/data/autobackup
for details.
Note: This file is ignored for devices older than API 31
See https://developer.android.com/about/versions/12/backup-restore
-->
<full-backup-content>
<!--
<include domain="sharedpref" path="."/>
<exclude domain="sharedpref" path="device.xml"/>
-->
</full-backup-content>

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!--
Sample data extraction rules file; uncomment and customize as necessary.
See https://developer.android.com/about/versions/12/backup-restore#xml-changes
for details.
-->
<data-extraction-rules>
<cloud-backup>
<!-- TODO: Use <include> and <exclude> to control what is backed up.
<include .../>
<exclude .../>
-->
</cloud-backup>
<!--
<device-transfer>
<include .../>
<exclude .../>
</device-transfer>
-->
</data-extraction-rules>

@@ -1,17 +0,0 @@
package com.zlmediakit.webrtc
import org.junit.Test
import org.junit.Assert.*
/**
* Example local unit test, which will execute on the development machine (host).
*
* See [testing documentation](http://d.android.com/tools/testing).
*/
class ExampleUnitTest {
@Test
fun addition_isCorrect() {
assertEquals(4, 2 + 2)
}
}

@@ -1,19 +0,0 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
ext.kotlin_version = '1.7.10'
repositories {
mavenCentral()
}
dependencies {
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
}
}
plugins {
id 'com.android.application' version '7.2.1' apply false
id 'com.android.library' version '7.2.1' apply false
id 'org.jetbrains.kotlin.android' version '1.7.10' apply false
}
task clean(type: Delete) {
delete rootProject.buildDir
}

@@ -1,23 +0,0 @@
# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html
# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
org.gradle.jvmargs=-Xmx2048m -Dfile.encoding=UTF-8
# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true
# AndroidX package structure to make it clearer which packages are bundled with the
# Android operating system, and which are packaged with your app's APK
# https://developer.android.com/topic/libraries/support-library/androidx-rn
android.useAndroidX=true
# Kotlin code style for this project: "official" or "obsolete":
kotlin.code.style=official
# Enables namespacing of each library's R class so that its R class includes only the
# resources declared in the library itself and none from the library's dependencies,
# thereby reducing the size of the R class for that library
android.nonTransitiveRClass=true
