A detailed guide to implementing automatic two-way call recording on Android 6.0
This article walks through, with working code, how to implement automatic two-way call recording on Android 6.0. It is shared for your reference; the details follow.
A project required automatic two-way call recording based on Android 6.0. After reading articles on Android phone-state monitoring and studying open-source recording projects on Git, I put this article together.
First, an introduction to monitoring the phone state on Android (incoming and outgoing calls):
https://www.nhooo.com/article/32433.htm
Monitoring the phone's call state relies mainly on two classes:
TelephonyManager and PhoneStateListener
TelephonyManager provides a way to obtain information about the phone's basic telephony services, so an application can use it to query the state of those services. An application can also register a listener to be notified when the call state changes.
We cannot instantiate TelephonyManager ourselves; it can only be obtained as a system service:
Context.getSystemService(Context.TELEPHONY_SERVICE);
Note: reading certain phone information requires the corresponding permission.
The main static constants (they correspond to what PhoneStateListener.LISTEN_CALL_STATE reports):
int CALL_STATE_IDLE     // Idle: no call activity.
int CALL_STATE_OFFHOOK  // Off-hook: at least one call is active (dialing, in progress, or on hold) and no call is ringing or waiting.
int CALL_STATE_RINGING  // Ringing: an incoming call is ringing, or a new call has arrived during an existing call and is waiting.
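As an illustration of how these constants are delivered, here is a minimal sketch (not part of the original project) that obtains the TelephonyManager as a system service and registers a PhoneStateListener; context is assumed to be a valid Context such as an Activity or Service:

TelephonyManager tm = (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
PhoneStateListener listener = new PhoneStateListener() {
    @Override
    public void onCallStateChanged(int state, String incomingNumber) {
        switch (state) {
            case TelephonyManager.CALL_STATE_IDLE:
                // no call activity
                break;
            case TelephonyManager.CALL_STATE_OFFHOOK:
                // a call is being dialed, is in progress, or is on hold
                break;
            case TelephonyManager.CALL_STATE_RINGING:
                // an incoming call is ringing (incomingNumber carries the caller's number)
                break;
        }
    }
};
tm.listen(listener, PhoneStateListener.LISTEN_CALL_STATE);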
The project uses a service to monitor the call state, so we also need to know which values represent the call state in the corresponding broadcast:
EXTRA_STATE_IDLE            // In the phone-state-changed broadcast, represents CALL_STATE_IDLE (idle).
EXTRA_STATE_OFFHOOK         // In the phone-state-changed broadcast, represents CALL_STATE_OFFHOOK (off-hook).
EXTRA_STATE_RINGING         // In the phone-state-changed broadcast, represents CALL_STATE_RINGING (ringing).
ACTION_PHONE_STATE_CHANGED  // The action that identifies the phone-state-changed broadcast (intent).
                            // Note: requires the READ_PHONE_STATE permission.
String EXTRA_INCOMING_NUMBER  // Extra key for reading the incoming number from the phone-state-changed broadcast.
String EXTRA_STATE            // Extra key for reading the call state from the phone-state-changed broadcast.
So how do we implement call monitoring?
When the phone state changes, Android sends a broadcast with the action android.intent.action.PHONE_STATE, and when an outgoing call is placed it sends a broadcast with the action
public static final String ACTION_NEW_OUTGOING_CALL = "android.intent.action.NEW_OUTGOING_CALL";
By writing a custom broadcast receiver that receives these two broadcasts, we can track every call.
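The article does not show how the receiver is registered. The usual approach is to declare PhoneCallReceiver in AndroidManifest.xml with intent filters for the two actions above (and to declare the READ_PHONE_STATE and PROCESS_OUTGOING_CALLS permissions). As a minimal sketch under that assumption, the receiver can also be registered dynamically from a long-lived component:

// Minimal sketch (assumption: called on a Context, e.g. in Application.onCreate();
// manifest registration is the alternative the original project presumably uses).
IntentFilter filter = new IntentFilter();
filter.addAction(TelephonyManager.ACTION_PHONE_STATE_CHANGED); // "android.intent.action.PHONE_STATE"
filter.addAction(Intent.ACTION_NEW_OUTGOING_CALL);             // "android.intent.action.NEW_OUTGOING_CALL"
registerReceiver(new PhoneCallReceiver(), filter);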
The Java code is given below (the Toasts are only there to make testing easier):
package com.example.hgx.phoneinfo60.Recording;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.TelephonyManager;
import android.widget.Toast;

/**
 * Created by hgx on 2016/6/13.
 */
public class PhoneCallReceiver extends BroadcastReceiver {
    private int lastCallState = TelephonyManager.CALL_STATE_IDLE;
    private boolean isIncoming = false;
    private static String contactNum;
    Intent audioRecorderService;

    public PhoneCallReceiver() {
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        // Outgoing call
        if (intent.getAction().equals(Intent.ACTION_NEW_OUTGOING_CALL)) {
            contactNum = intent.getExtras().getString(Intent.EXTRA_PHONE_NUMBER);
        } else {
            // android.intent.action.PHONE_STATE: checking the Android docs, there seems to be
            // no action dedicated to incoming calls, so anything that is not an outgoing call
            // is treated as an incoming one.
            String state = intent.getExtras().getString(TelephonyManager.EXTRA_STATE);
            String phoneNumber = intent.getExtras().getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
            int stateChange = 0;
            if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
                // Idle state
                stateChange = TelephonyManager.CALL_STATE_IDLE;
                if (isIncoming) {
                    onIncomingCallEnded(context, phoneNumber);
                } else {
                    onOutgoingCallEnded(context, phoneNumber);
                }
            } else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)) {
                // Off-hook state
                stateChange = TelephonyManager.CALL_STATE_OFFHOOK;
                if (lastCallState != TelephonyManager.CALL_STATE_RINGING) {
                    // If the previous state was not ringing, this call is outgoing
                    isIncoming = false;
                    onOutgoingCallStarted(context, phoneNumber);
                } else {
                    // Otherwise this call is incoming
                    isIncoming = true;
                    onIncomingCallAnswered(context, phoneNumber);
                }
            } else if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)) {
                // Ringing state
                stateChange = TelephonyManager.CALL_STATE_RINGING;
                lastCallState = stateChange;
                onIncomingCallReceived(context, contactNum);
            }
        }
    }

    protected void onIncomingCallStarted(Context context, String number) {
        Toast.makeText(context, "Incoming call is started", Toast.LENGTH_LONG).show();
        context.startService(new Intent(context, AudioRecorderService.class));
    }

    protected void onOutgoingCallStarted(Context context, String number) {
        Toast.makeText(context, "Outgoing call is started", Toast.LENGTH_LONG).show();
        context.startService(new Intent(context, AudioRecorderService.class));
    }

    protected void onIncomingCallEnded(Context context, String number) {
        Toast.makeText(context, "Incoming call is ended", Toast.LENGTH_LONG).show();
        // Stop the recording service when the call ends
        context.stopService(new Intent(context, AudioRecorderService.class));
    }

    protected void onOutgoingCallEnded(Context context, String number) {
        Toast.makeText(context, "Outgoing call is ended", Toast.LENGTH_LONG).show();
        // Stop the recording service when the call ends
        context.stopService(new Intent(context, AudioRecorderService.class));
    }

    protected void onIncomingCallReceived(Context context, String number) {
        Toast.makeText(context, "Incoming call is received", Toast.LENGTH_LONG).show();
    }

    protected void onIncomingCallAnswered(Context context, String number) {
        Toast.makeText(context, "Incoming call is answered", Toast.LENGTH_LONG).show();
    }
}
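Because the article targets Android 6.0, the dangerous permissions involved (READ_PHONE_STATE, PROCESS_OUTGOING_CALLS, RECORD_AUDIO, WRITE_EXTERNAL_STORAGE) must be granted at runtime as well as declared in the manifest. A minimal sketch, assuming it runs in a launcher Activity before any call is recorded (this is not part of the original article):

// Runtime permission check/request for Android 6.0+; assumed to run inside an Activity.
String[] perms = {
        Manifest.permission.READ_PHONE_STATE,
        Manifest.permission.PROCESS_OUTGOING_CALLS,
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
};
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
    for (String perm : perms) {
        if (checkSelfPermission(perm) != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(perms, 1); // result arrives in onRequestPermissionsResult()
            break;
        }
    }
}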
Next is the Java implementation of AudioRecorderService:
package com.example.hgx.phoneinfo60.Recording;

import android.app.Service;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.AsyncTask;
import android.os.Environment;
import android.os.IBinder;
import android.provider.MediaStore;
import android.util.Log;
import android.widget.Toast;
import com.example.hgx.phoneinfo60.MyApplication;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Created by hgx on 2016/6/13.
 */
public class AudioRecorderService extends Service {
    private static int RECORD_RATE = 0;
    private static int RECORD_BPP = 32;
    private static int RECORD_CHANNEL = AudioFormat.CHANNEL_IN_MONO;
    private static int RECORD_ENCODER = AudioFormat.ENCODING_PCM_16BIT;
    private AudioRecord audioRecorder = null;
    private Thread recordT = null;
    private Boolean isRecording = false;
    private int bufferEle = 1024, bytesPerEle = 2; // 2048 bytes per read: 1024 elements * 2 bytes in 16-bit format
    private static int[] recordRate = {44100, 22050, 11025, 8000};
    int bufferSize = 0;
    File uploadFile;

    @Override
    public IBinder onBind(Intent intent) {
        // TODO: Return the communication channel to the service.
        // Would maintain the relationship between the caller activity and this service; currently unused here
        return null;
    }

    @Override
    public void onDestroy() {
        if (isRecording) {
            stopRecord();
        } else {
            Toast.makeText(MyApplication.getContext(), "Recording is already stopped", Toast.LENGTH_SHORT).show();
        }
        super.onDestroy();
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (!isRecording) {
            startRecord();
        } else {
            Toast.makeText(MyApplication.getContext(), "Recording is already started", Toast.LENGTH_SHORT).show();
        }
        return 1;
    }

    private void startRecord() {
        audioRecorder = initializeRecord();
        if (audioRecorder != null) {
            Toast.makeText(MyApplication.getContext(), "Recording is started", Toast.LENGTH_SHORT).show();
            audioRecorder.startRecording();
        } else
            return;
        isRecording = true;
        recordT = new Thread(new Runnable() {
            @Override
            public void run() {
                writeToFile();
            }
        }, "RecordingThread");
        recordT.start();
    }

    private void writeToFile() {
        byte bData[] = new byte[bufferEle];
        FileOutputStream fos = null;
        File recordFile = createTempFile();
        try {
            fos = new FileOutputStream(recordFile);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        while (isRecording) {
            audioRecorder.read(bData, 0, bufferEle);
            try {
                // Write each buffer as it is read, so the whole call is saved
                fos.write(bData);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        try {
            fos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // The following function converts short data to byte data
    private byte[] writeShortToByte(short[] sData) {
        int size = sData.length;
        byte[] byteArrayData = new byte[size * 2];
        for (int i = 0; i < size; i++) {
            byteArrayData[i * 2] = (byte) (sData[i] & 0x00FF);
            byteArrayData[(i * 2) + 1] = (byte) (sData[i] >> 8);
            sData[i] = 0;
        }
        return byteArrayData;
    }

    // Creates the temporary .raw file for recording
    private File createTempFile() {
        File tempFile = new File(Environment.getExternalStorageDirectory(), "aditi.raw");
        return tempFile;
    }

    // Creates the file used for the converted .wav output
    private File createWavFile() {
        File wavFile = new File(Environment.getExternalStorageDirectory(), "aditi_" + System.currentTimeMillis() + ".wav");
        return wavFile;
    }

    /*
     * Convert the raw file to a wav file
     * @param java.io.File temporary raw file
     * @param java.io.File destination wav file
     * @return void
     */
    private void convertRawToWavFile(File tempFile, File wavFile) {
        FileInputStream fin = null;
        FileOutputStream fos = null;
        long audioLength = 0;
        long dataLength = audioLength + 36;
        long sampleRate = RECORD_RATE;
        int channel = 1;
        long byteRate = RECORD_BPP * RECORD_RATE * channel / 8;
        String fileName = null;
        byte[] data = new byte[bufferSize];
        try {
            fin = new FileInputStream(tempFile);
            fos = new FileOutputStream(wavFile);
            audioLength = fin.getChannel().size();
            dataLength = audioLength + 36;
            createWaveFileHeader(fos, audioLength, dataLength, sampleRate, channel, byteRate);
            while (fin.read(data) != -1) {
                fos.write(data);
            }
            uploadFile = wavFile.getAbsoluteFile();
        } catch (FileNotFoundException e) {
            //Log.e("MainActivity:convertRawToWavFile", e.getMessage());
        } catch (IOException e) {
            //Log.e("MainActivity:convertRawToWavFile", e.getMessage());
        } catch (Exception e) {
            //Log.e("MainActivity:convertRawToWavFile", e.getMessage());
        }
    }

    /*
     * To create a wav file we need to write its 44-byte header
     *
     * @param java.io.FileOutputStream
     * @param long
     * @param long
     * @param long
     * @param int
     * @param long
     * @return void
     */
    private void createWaveFileHeader(FileOutputStream fos, long audioLength, long dataLength, long sampleRate, int channel, long byteRate) {
        byte[] header = new byte[44];
        header[0] = 'R'; // RIFF/WAVE header
        header[1] = 'I';
        header[2] = 'F';
        header[3] = 'F';
        header[4] = (byte) (dataLength & 0xff);
        header[5] = (byte) ((dataLength >> 8) & 0xff);
        header[6] = (byte) ((dataLength >> 16) & 0xff);
        header[7] = (byte) ((dataLength >> 24) & 0xff);
        header[8] = 'W';
        header[9] = 'A';
        header[10] = 'V';
        header[11] = 'E';
        header[12] = 'f'; // 'fmt ' chunk
        header[13] = 'm';
        header[14] = 't';
        header[15] = ' ';
        header[16] = 16; // 4 bytes: size of 'fmt ' chunk
        header[17] = 0;
        header[18] = 0;
        header[19] = 0;
        header[20] = 1; // format = 1 (PCM)
        header[21] = 0;
        header[22] = (byte) channel;
        header[23] = 0;
        header[24] = (byte) (sampleRate & 0xff);
        header[25] = (byte) ((sampleRate >> 8) & 0xff);
        header[26] = (byte) ((sampleRate >> 16) & 0xff);
        header[27] = (byte) ((sampleRate >> 24) & 0xff);
        header[28] = (byte) (byteRate & 0xff);
        header[29] = (byte) ((byteRate >> 8) & 0xff);
        header[30] = (byte) ((byteRate >> 16) & 0xff);
        header[31] = (byte) ((byteRate >> 24) & 0xff);
        header[32] = (byte) (2 * 16 / 8); // block align
        header[33] = 0;
        header[34] = 16; // bits per sample
        header[35] = 0;
        header[36] = 'd';
        header[37] = 'a';
        header[38] = 't';
        header[39] = 'a';
        header[40] = (byte) (audioLength & 0xff);
        header[41] = (byte) ((audioLength >> 8) & 0xff);
        header[42] = (byte) ((audioLength >> 16) & 0xff);
        header[43] = (byte) ((audioLength >> 24) & 0xff);
        try {
            fos.write(header, 0, 44);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            //Log.e("MainActivity:createWavFileHeader()", e.getMessage());
        }
    }

    /*
     * Delete the temporary file created for recording
     * @param
     * @return void
     */
    private void deletTempFile() {
        File file = createTempFile();
        file.delete();
    }

    /*
     * Initialize the audio record
     *
     * @param
     * @return android.media.AudioRecord
     */
    private AudioRecord initializeRecord() {
        short[] audioFormat = new short[]{AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_8BIT};
        short[] channelConfiguration = new short[]{AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO};
        for (int rate : recordRate) {
            for (short aFormat : audioFormat) {
                for (short cConf : channelConfiguration) {
                    //Log.d("MainActivity:initializeRecord()", "Rate " + rate + " AudioFormat " + aFormat + " ChannelConfiguration " + cConf);
                    try {
                        int buffSize = AudioRecord.getMinBufferSize(rate, cConf, aFormat);
                        bufferSize = buffSize;
                        if (buffSize != AudioRecord.ERROR_BAD_VALUE) {
                            AudioRecord aRecorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, cConf, aFormat, buffSize);
                            if (aRecorder.getState() == AudioRecord.STATE_INITIALIZED) {
                                RECORD_RATE = rate;
                                //Log.d("MainActivity:InitializeRecord - AudioFormat", String.valueOf(aFormat));
                                //Log.d("MainActivity:InitializeRecord - Channel", String.valueOf(cConf));
                                //Log.d("MainActivity:InitializeRecord - recordRate", String.valueOf(rate));
                                return aRecorder;
                            }
                        }
                    } catch (Exception e) {
                        //Log.e("MainActivity:initializeRecord()", e.getMessage());
                    }
                }
            }
        }
        return null;
    }

    /*
     * Stop and release the audio record
     *
     * @param
     * @return void
     */
    private void stopRecord() {
        if (null != audioRecorder) {
            isRecording = false;
            audioRecorder.stop();
            audioRecorder.release();
            audioRecorder = null;
            recordT = null;
            Toast.makeText(getApplicationContext(), "Recording is stopped", Toast.LENGTH_LONG).show();
        }
        convertRawToWavFile(createTempFile(), createWavFile());
        if (uploadFile.exists()) {
            //Log.d("AudioRecorderService:stopRecord()", "Upload file exists");
        }
        new UploadFile().execute(uploadFile);
        deletTempFile();
    }
}
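Note that the service refers to two helpers that the article does not include: MyApplication.getContext() and the UploadFile AsyncTask used in stopRecord(). UploadFile is the author's own upload task and is not reproduced here; for the Toasts to work, an application class along the following lines is assumed (a hypothetical sketch, registered in the manifest via android:name):

package com.example.hgx.phoneinfo60;

import android.app.Application;
import android.content.Context;

// Hypothetical sketch of the MyApplication helper assumed by AudioRecorderService;
// it exposes the application context statically.
public class MyApplication extends Application {
    private static Context context;

    @Override
    public void onCreate() {
        super.onCreate();
        context = getApplicationContext();
    }

    public static Context getContext() {
        return context;
    }
}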
I hope this article is helpful to readers doing Android programming.