A Detailed Guide to Implementing Automatic Two-Way Call Recording on Android 6.0


Overview: this article walks through how to implement automatic two-way (incoming and outgoing) call recording on Android 6.0. It is shared here for reference; the details are as follows.

A project of mine required automatic two-way call recording on Android 6.0. After reading up on Android call-state monitoring and studying open-source recording projects on Git, I put this article together.

First, an introduction to monitoring the phone's call state on Android (incoming and outgoing calls). A related reference:
https://www.oudahe.com/p/17118/

Monitoring the phone's call state relies mainly on two classes:

TelephonyManager
PhoneStateListener

TelephonyManager provides a way to obtain information about the phone's basic telephony services, so an application can use it to probe the state of those services. An application can also register a listener to be notified when the call state changes.

TelephonyManager cannot be instantiated directly; it can only be obtained as a system service:

TelephonyManager telephonyManager =
    (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);

Note: reading some of the phone's information requires the corresponding permissions (for example READ_PHONE_STATE), and on Android 6.0 dangerous permissions must additionally be granted at runtime.
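Because this article targets Android 6.0, the dangerous permissions the later code relies on (READ_PHONE_STATE, PROCESS_OUTGOING_CALLS, RECORD_AUDIO, WRITE_EXTERNAL_STORAGE) are no longer granted at install time and have to be requested at runtime as well. The snippet below is only a minimal sketch of such a request using the v4 support library; the helper class name, method name, request code and exact permission list are assumptions to be adapted to your project:

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

public class PermissionHelper {
  private static final int REQUEST_CALL_RECORDING = 1001; // arbitrary request code

  // Ask for every permission the recording feature needs that has not been granted yet.
  public static void requestCallRecordingPermissions(Activity activity) {
    String[] permissions = {
        Manifest.permission.READ_PHONE_STATE,
        Manifest.permission.PROCESS_OUTGOING_CALLS,
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
    };
    for (String permission : permissions) {
      if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
        // At least one permission is missing: request the whole set and let the
        // Activity's onRequestPermissionsResult() handle the outcome.
        ActivityCompat.requestPermissions(activity, permissions, REQUEST_CALL_RECORDING);
        return;
      }
    }
  }
}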
Main static constants (they correspond to what PhoneStateListener.LISTEN_CALL_STATE reports):

int CALL_STATE_IDLE     // idle: no call activity
int CALL_STATE_OFFHOOK  // off-hook: at least one call is active, i.e. dialing, in progress or on hold, and no call is ringing or waiting
int CALL_STATE_RINGING  // ringing: an incoming call is ringing, or a new call is waiting while another call is already in progress
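As a concrete illustration of how these constants are delivered, a minimal listener-based sketch could look like the one below. The class name is made up for this example; the project described in this article uses a broadcast receiver instead, as shown later:

import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.util.Log;

public class CallStateMonitor {
  private static final String TAG = "CallStateMonitor";

  // Register a PhoneStateListener so onCallStateChanged() fires with the CALL_STATE_* constants above.
  public void startListening(Context context) {
    TelephonyManager telephonyManager =
        (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
    telephonyManager.listen(new PhoneStateListener() {
      @Override
      public void onCallStateChanged(int state, String incomingNumber) {
        switch (state) {
          case TelephonyManager.CALL_STATE_IDLE:
            Log.d(TAG, "Idle, no call activity");
            break;
          case TelephonyManager.CALL_STATE_OFFHOOK:
            Log.d(TAG, "Off-hook, a call is dialing, active or on hold");
            break;
          case TelephonyManager.CALL_STATE_RINGING:
            Log.d(TAG, "Ringing, incoming number: " + incomingNumber);
            break;
        }
      }
    }, PhoneStateListener.LISTEN_CALL_STATE);
  }
}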

The project uses a service together with broadcasts to monitor the call state, so we need to be clear about how the call states are represented in the broadcast:

EXTRA_STATE_IDLE             // in the phone-state-changed broadcast, represents CALL_STATE_IDLE, i.e. the idle state
EXTRA_STATE_OFFHOOK          // in the phone-state-changed broadcast, represents CALL_STATE_OFFHOOK, i.e. the off-hook state
EXTRA_STATE_RINGING          // in the phone-state-changed broadcast, represents CALL_STATE_RINGING, i.e. the ringing state
ACTION_PHONE_STATE_CHANGED   // the action that marks a phone-state-changed broadcast (intent); note: requires the READ_PHONE_STATE permission
String EXTRA_INCOMING_NUMBER // extra of the phone-state-changed broadcast carrying the incoming number
String EXTRA_STATE           // extra of the phone-state-changed broadcast carrying the call state

So how do we actually implement the call monitoring?

When the call state changes, Android sends a broadcast with the action android.intent.action.PHONE_STATE, and when an outgoing call is placed it sends a broadcast with the action

public static final String ACTION_NEW_OUTGOING_CALL = "android.intent.action.NEW_OUTGOING_CALL";

A custom broadcast receiver that handles these two broadcasts is therefore all we need.

The Java code is given below (the Toasts are only there to make testing easier):

package com.example.hgx.phoneinfo60.Recording;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.TelephonyManager;
import android.widget.Toast;

/**
 * Created by hgx on 2016/6/13.
 */
public class PhoneCallReceiver extends BroadcastReceiver {
  private int lastCallState = TelephonyManager.CALL_STATE_IDLE;
  private boolean isIncoming = false;
  private static String contactNum;
  Intent audioRecorderService;

  public PhoneCallReceiver() {
  }

  @Override
  public void onReceive(Context context, Intent intent) {
    if (intent.getAction().equals(Intent.ACTION_NEW_OUTGOING_CALL)) {
      // Outgoing call: remember the dialed number.
      contactNum = intent.getExtras().getString(Intent.EXTRA_PHONE_NUMBER);
    } else {
      // android.intent.action.PHONE_STATE: there is no dedicated action for incoming calls,
      // so every phone-state broadcast that is not an outgoing call is handled here.
      String state = intent.getExtras().getString(TelephonyManager.EXTRA_STATE);
      String phoneNumber = intent.getExtras().getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
      int stateChange = 0;
      if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
        // Idle state: the call (incoming or outgoing) has ended.
        stateChange = TelephonyManager.CALL_STATE_IDLE;
        if (isIncoming) {
          onIncomingCallEnded(context, phoneNumber);
        } else {
          onOutgoingCallEnded(context, phoneNumber);
        }
      } else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)) {
        // Off-hook state.
        stateChange = TelephonyManager.CALL_STATE_OFFHOOK;
        if (lastCallState != TelephonyManager.CALL_STATE_RINGING) {
          // The previous state was not "ringing", so this call is outgoing.
          isIncoming = false;
          onOutgoingCallStarted(context, phoneNumber);
        } else {
          // Otherwise this call is incoming and has just been answered.
          isIncoming = true;
          onIncomingCallAnswered(context, phoneNumber);
        }
      } else if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)) {
        // Ringing state: an incoming call has arrived.
        stateChange = TelephonyManager.CALL_STATE_RINGING;
        onIncomingCallReceived(context, phoneNumber);
      }
      // Remember the last state so the next OFFHOOK broadcast can be classified correctly.
      lastCallState = stateChange;
    }
  }

  protected void onIncomingCallStarted(Context context, String number) {
    Toast.makeText(context, "Incoming call is started", Toast.LENGTH_LONG).show();
    context.startService(new Intent(context, AudioRecorderService.class));
  }

  protected void onOutgoingCallStarted(Context context, String number) {
    Toast.makeText(context, "Outgoing call is started", Toast.LENGTH_LONG).show();
    context.startService(new Intent(context, AudioRecorderService.class));
  }

  protected void onIncomingCallEnded(Context context, String number) {
    Toast.makeText(context, "Incoming call is ended", Toast.LENGTH_LONG).show();
    // The call is over, so stop the recording service (its onDestroy() finishes the recording).
    context.stopService(new Intent(context, AudioRecorderService.class));
  }

  protected void onOutgoingCallEnded(Context context, String number) {
    Toast.makeText(context, "Outgoing call is ended", Toast.LENGTH_LONG).show();
    context.stopService(new Intent(context, AudioRecorderService.class));
  }

  protected void onIncomingCallReceived(Context context, String number) {
    Toast.makeText(context, "Incoming call is received", Toast.LENGTH_LONG).show();
  }

  protected void onIncomingCallAnswered(Context context, String number) {
    Toast.makeText(context, "Incoming call is answered", Toast.LENGTH_LONG).show();
  }
}
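The receiver above still has to be registered before it will see either broadcast. The original project does not show this step; the receiver can either be declared statically in AndroidManifest.xml with the android.intent.action.PHONE_STATE and android.intent.action.NEW_OUTGOING_CALL actions (plus the READ_PHONE_STATE and PROCESS_OUTGOING_CALLS permissions), or registered dynamically from code. The following is only a sketch of the dynamic variant; the helper class name and the idea of calling it from a long-lived component are assumptions, not part of the original article:

import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.telephony.TelephonyManager;

public class CallMonitorStarter {
  private PhoneCallReceiver phoneCallReceiver;

  // Register the receiver for both call-state broadcasts; call this once, e.g. from a service's onCreate().
  public void register(Context context) {
    phoneCallReceiver = new PhoneCallReceiver();
    IntentFilter filter = new IntentFilter();
    filter.addAction(TelephonyManager.ACTION_PHONE_STATE_CHANGED); // "android.intent.action.PHONE_STATE"
    filter.addAction(Intent.ACTION_NEW_OUTGOING_CALL);             // "android.intent.action.NEW_OUTGOING_CALL"
    context.registerReceiver(phoneCallReceiver, filter);
  }

  // Unregister when monitoring is no longer needed, e.g. from onDestroy().
  public void unregister(Context context) {
    if (phoneCallReceiver != null) {
      context.unregisterReceiver(phoneCallReceiver);
      phoneCallReceiver = null;
    }
  }
}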

Below is the Java implementation of AudioRecorderService:

package com.example.hgx.phoneinfo60.Recording;

import android.app.Service;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Environment;
import android.os.IBinder;
import android.util.Log;
import android.widget.Toast;

import com.example.hgx.phoneinfo60.MyApplication; // project Application subclass exposing a static context (not shown here)

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

/**
 * Created by hgx on 2016/6/13.
 */
public class AudioRecorderService extends Service {
  private static final String TAG = "AudioRecorderService";
  private static int RECORD_RATE = 0;
  private static int RECORD_BPP = 16; // bits per sample of the 16-bit PCM recording
  private static int RECORD_CHANNEL = AudioFormat.CHANNEL_IN_MONO;
  private static int RECORD_ENCODER = AudioFormat.ENCODING_PCM_16BIT;
  private AudioRecord audioRecorder = null;
  private Thread recordT = null;
  private boolean isRecording = false;
  private int bufferEle = 1024, bytesPerEle = 2; // read 1024 bytes per pass; 16-bit samples take 2 bytes each
  private static int[] recordRate = {44100, 22050, 11025, 8000};
  int bufferSize = 0;
  File uploadFile;

  @Override
  public IBinder onBind(Intent intent) {
    // Would maintain the relationship between a bound client and this service; not needed here.
    return null;
  }

  @Override
  public void onDestroy() {
    if (isRecording) {
      stopRecord();
    } else {
      Toast.makeText(MyApplication.getContext(), "Recording is already stopped", Toast.LENGTH_SHORT).show();
    }
    super.onDestroy();
  }

  @Override
  public int onStartCommand(Intent intent, int flags, int startId) {
    if (!isRecording) {
      startRecord();
    } else {
      Toast.makeText(MyApplication.getContext(), "Recording is already started", Toast.LENGTH_SHORT).show();
    }
    return 1;
  }

  private void startRecord() {
    audioRecorder = initializeRecord();
    if (audioRecorder != null) {
      Toast.makeText(MyApplication.getContext(), "Recording is started", Toast.LENGTH_SHORT).show();
      audioRecorder.startRecording();
    } else {
      return;
    }
    isRecording = true;
    // Pull PCM data off the AudioRecord on a worker thread so the main thread is not blocked.
    recordT = new Thread(new Runnable() {
      @Override
      public void run() {
        writeToFile();
      }
    }, "Recording Thread");
    recordT.start();
  }

  private void writeToFile() {
    byte[] bData = new byte[bufferEle];
    FileOutputStream fos;
    File recordFile = createTempFile();
    try {
      fos = new FileOutputStream(recordFile);
    } catch (FileNotFoundException e) {
      e.printStackTrace();
      return;
    }
    while (isRecording) {
      // Write every buffer as soon as it has been read, so the whole call ends up in the raw file.
      int read = audioRecorder.read(bData, 0, bufferEle);
      if (read > 0) {
        try {
          fos.write(bData, 0, read);
        } catch (IOException e) {
          e.printStackTrace();
        }
      }
    }
    try {
      fos.close();
    } catch (IOException e) {
      e.printStackTrace();
    }
  }

  // Converts short samples to little-endian bytes (kept from the original, unused in the byte-based path above).
  private byte[] writeShortToByte(short[] sData) {
    int size = sData.length;
    byte[] byteArrayData = new byte[size * 2];
    for (int i = 0; i < size; i++) {
      byteArrayData[i * 2] = (byte) (sData[i] & 0x00FF);
      byteArrayData[(i * 2) + 1] = (byte) (sData[i] >> 8);
      sData[i] = 0;
    }
    return byteArrayData;
  }

  // Creates the temporary .raw file the PCM data is recorded into.
  private File createTempFile() {
    return new File(Environment.getExternalStorageDirectory(), "aditi.raw");
  }

  // Creates the destination .wav file the raw recording is converted to.
  private File createWavFile() {
    return new File(Environment.getExternalStorageDirectory(), "aditi_" + System.currentTimeMillis() + ".wav");
  }

  /*
   * Convert the temporary raw file into a wav file by prepending a RIFF/WAVE header.
   */
  private void convertRawToWavFile(File tempFile, File wavFile) {
    FileInputStream fin;
    FileOutputStream fos;
    long audioLength;
    long dataLength;
    long sampleRate = RECORD_RATE;
    int channel = 1;
    long byteRate = RECORD_BPP * RECORD_RATE * channel / 8;
    byte[] data = new byte[bufferSize];
    try {
      fin = new FileInputStream(tempFile);
      fos = new FileOutputStream(wavFile);
      audioLength = fin.getChannel().size();
      dataLength = audioLength + 36;
      createWaveFileHeader(fos, audioLength, dataLength, sampleRate, channel, byteRate);
      int length;
      while ((length = fin.read(data)) != -1) {
        fos.write(data, 0, length);
      }
      uploadFile = wavFile.getAbsoluteFile();
      fin.close();
      fos.close();
    } catch (IOException e) {
      Log.e(TAG, "convertRawToWavFile failed", e);
    } catch (Exception e) {
      Log.e(TAG, "convertRawToWavFile failed", e);
    }
  }

  /*
   * Write the 44-byte RIFF/WAVE header for 16-bit PCM data.
   */
  private void createWaveFileHeader(FileOutputStream fos, long audioLength, long dataLength,
                                    long sampleRate, int channel, long byteRate) {
    byte[] header = new byte[44];
    header[0] = 'R'; // RIFF/WAVE header
    header[1] = 'I';
    header[2] = 'F';
    header[3] = 'F';
    header[4] = (byte) (dataLength & 0xff);
    header[5] = (byte) ((dataLength >> 8) & 0xff);
    header[6] = (byte) ((dataLength >> 16) & 0xff);
    header[7] = (byte) ((dataLength >> 24) & 0xff);
    header[8] = 'W';
    header[9] = 'A';
    header[10] = 'V';
    header[11] = 'E';
    header[12] = 'f'; // 'fmt ' chunk
    header[13] = 'm';
    header[14] = 't';
    header[15] = ' ';
    header[16] = 16; // size of the 'fmt ' chunk
    header[17] = 0;
    header[18] = 0;
    header[19] = 0;
    header[20] = 1; // audio format = 1 (PCM)
    header[21] = 0;
    header[22] = (byte) channel;
    header[23] = 0;
    header[24] = (byte) (sampleRate & 0xff);
    header[25] = (byte) ((sampleRate >> 8) & 0xff);
    header[26] = (byte) ((sampleRate >> 16) & 0xff);
    header[27] = (byte) ((sampleRate >> 24) & 0xff);
    header[28] = (byte) (byteRate & 0xff);
    header[29] = (byte) ((byteRate >> 8) & 0xff);
    header[30] = (byte) ((byteRate >> 16) & 0xff);
    header[31] = (byte) ((byteRate >> 24) & 0xff);
    header[32] = (byte) (channel * RECORD_BPP / 8); // block align
    header[33] = 0;
    header[34] = (byte) RECORD_BPP; // bits per sample
    header[35] = 0;
    header[36] = 'd';
    header[37] = 'a';
    header[38] = 't';
    header[39] = 'a';
    header[40] = (byte) (audioLength & 0xff);
    header[41] = (byte) ((audioLength >> 8) & 0xff);
    header[42] = (byte) ((audioLength >> 16) & 0xff);
    header[43] = (byte) ((audioLength >> 24) & 0xff);
    try {
      fos.write(header, 0, 44);
    } catch (IOException e) {
      Log.e(TAG, "createWaveFileHeader failed", e);
    }
  }

  /*
   * Delete the temporary raw file once the wav file has been written.
   */
  private void deleteTempFile() {
    File file = createTempFile();
    file.delete();
  }

  /*
   * Initialize the AudioRecord: try sample rates, encodings and channel configurations
   * until a combination is accepted by the device.
   */
  private AudioRecord initializeRecord() {
    short[] audioFormat = new short[]{AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_8BIT};
    short[] channelConfiguration = new short[]{AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO};
    for (int rate : recordRate) {
      for (short aFormat : audioFormat) {
        for (short cConf : channelConfiguration) {
          try {
            int buffSize = AudioRecord.getMinBufferSize(rate, cConf, aFormat);
            bufferSize = buffSize;
            if (buffSize != AudioRecord.ERROR_BAD_VALUE) {
              AudioRecord aRecorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, cConf, aFormat, buffSize);
              if (aRecorder.getState() == AudioRecord.STATE_INITIALIZED) {
                RECORD_RATE = rate;
                return aRecorder;
              }
            }
          } catch (Exception e) {
            Log.e(TAG, "initializeRecord failed", e);
          }
        }
      }
    }
    return null;
  }

  /*
   * Stop and release the AudioRecord, convert the raw recording to wav and clean up.
   */
  private void stopRecord() {
    if (null != audioRecorder) {
      isRecording = false;
      audioRecorder.stop();
      audioRecorder.release();
      audioRecorder = null;
      recordT = null;
      Toast.makeText(getApplicationContext(), "Recording is stopped", Toast.LENGTH_LONG).show();
    }
    convertRawToWavFile(createTempFile(), createWavFile());
    if (uploadFile != null && uploadFile.exists()) {
      Log.d(TAG, "wav file is ready: " + uploadFile.getAbsolutePath());
    }
    new UploadFile().execute(uploadFile); // UploadFile is an AsyncTask of the author's project, not shown here
    deleteTempFile();
  }
}
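To make the whole chain work, the project also has to declare AudioRecorderService (and the receiver, if it is registered statically) in AndroidManifest.xml, and request the permissions the code above relies on: READ_PHONE_STATE and PROCESS_OUTGOING_CALLS for the call-state broadcasts, RECORD_AUDIO for AudioRecord, and WRITE_EXTERNAL_STORAGE for the .raw/.wav files on external storage. On Android 6.0 the dangerous ones among these must also be granted at runtime, as sketched earlier. UploadFile and MyApplication are other classes of the author's project and are not reproduced in this article.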


I hope this article is helpful to everyone for Android programming.
