Tags: android, audio, audio-recording, android-audiorecord

AudioRecord with Gain Adjustment not working on Samsung Device


I have written code for recording an audio file using AudioRecord, and while writing the file to the SD card I am making two versions.

Version 1: The recorded file is saved to the SD card as-is.

Version 2: I apply a gain adjustment to the recorded file and save that to the SD card.

This works great on Sony Ericsson phones, and the audio volume is boosted to a great extent.

But I am struggling to make it work on Samsung devices.

When I play the recorded file, it sounds like Talking Tom :P

Initially I thought the Samsung devices did not like the combination of parameters I had used to create the AudioRecord.

So I used the following approach, in which I loop over the available configurations and use the first one that initializes successfully:

public AudioRecord findAudioRecord() {
    for (int rate: mSampleRates) {
        for (short audioFormat: new short[] {
            AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT
        }) {
            for (short channelConfig: new short[] {
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO
            }) {
                try {
                    Log.i("vipul", "Attempting rate " + rate + "Hz, bits: " + audioFormat + ", channel: " + channelConfig);
                    int bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);

                    if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
                        // check if we can instantiate and have a success
                        AudioRecord recorder = new AudioRecord(AudioSource.DEFAULT,
                                rate, channelConfig, audioFormat, bufferSize);

                        if (recorder.getState() == AudioRecord.STATE_INITIALIZED) return recorder;
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }
    return null;
}

Below is the code that works well on Sony phones but struggles on Samsung devices.

public class EnvironmentRecorder extends Activity implements OnClickListener {

    private static final int RECORDER_BPP = 16;
    private static final String AUDIO_RECORDER_FILE_EXT_WAV = ".wav";
    private static final String AUDIO_RECORDER_FOLDER = "MyRecorder";
    private static final String AUDIO_RECORDER_TEMP_FILE = "record_temp.raw";
    private static final int RECORDER_SAMPLERATE = 44100;
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_STEREO;
    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
    private Button start, stop;
    private AudioRecord recorder = null;
    private int bufferSize = 0;
    private Thread recordingThread = null;
    private boolean isRecording = false;
    private static int[] mSampleRates = new int[] {
        8000, 11025, 22050, 44100
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        start = (Button) findViewById(R.id.start);
        stop = (Button) findViewById(R.id.stop);
        start.setOnClickListener(this);
        stop.setOnClickListener(this);
    }

    @Override
    public void onClick(View view) {
        switch (view.getId()) {
        case R.id.start:
            startRecord();
            break;
        case R.id.stop:
            stopRecording();
            break;
        }
    }

    public EnvironmentRecorder() {

        try {
            bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
        } catch (Exception e) {
            e.printStackTrace();
        }

    }

    private String getFilename1() {
        String filepath = Environment.getExternalStorageDirectory().getPath();
        File file = new File(filepath, AUDIO_RECORDER_FOLDER);

        if (!file.exists()) {
            file.mkdirs();
        }

        return (file.getAbsolutePath() + "/" + "NotGained" + AUDIO_RECORDER_FILE_EXT_WAV);
    }

    private String getFilename2() {
        String filepath = Environment.getExternalStorageDirectory().getPath();
        File file = new File(filepath, AUDIO_RECORDER_FOLDER);

        if (!file.exists()) {
            file.mkdirs();
        }

        return (file.getAbsolutePath() + "/" + "Gained" + AUDIO_RECORDER_FILE_EXT_WAV);
    }

    private String getTempFilename() {
        String filepath = Environment.getExternalStorageDirectory().getPath();
        File file = new File(filepath, AUDIO_RECORDER_FOLDER);

        if (!file.exists()) {
            file.mkdirs();
        }

        File tempFile = new File(filepath, AUDIO_RECORDER_TEMP_FILE);

        if (tempFile.exists()) tempFile.delete();

        return (file.getAbsolutePath() + "/" + AUDIO_RECORDER_TEMP_FILE);
    }

    public AudioRecord findAudioRecord() {
        for (int rate: mSampleRates) {
            for (short audioFormat: new short[] {
                AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT
            }) {
                for (short channelConfig: new short[] {
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO
                }) {
                    try {
                        Log.v("vipul", "Attempting rate " + rate + "Hz, bits: " + audioFormat + ", channel: " + channelConfig);
                        int bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);

                        if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
                            // check if we can instantiate and have a success
                            AudioRecord recorder = new AudioRecord(AudioSource.DEFAULT,
                                    rate, channelConfig, audioFormat, bufferSize);

                            if (recorder.getState() == AudioRecord.STATE_INITIALIZED) return recorder;
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        return null;
    }

    public void startRecord() {
        /*
         * recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
         * RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING,
         * bufferSize);
         */

        recorder = findAudioRecord();

        recorder.startRecording();

        isRecording = true;

        recordingThread = new Thread(new Runnable() {

            @Override
            public void run() {
                writeAudioDataToFile();
            }
        }, "AudioRecorder Thread");

        recordingThread.start();
    }

    private void writeAudioDataToFile() {
        byte data[] = new byte[bufferSize];
        String filename = getTempFilename();
        FileOutputStream os = null;

        try {
            os = new FileOutputStream(filename);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        int read = 0;

        if (null != os) {
            while (isRecording) {
                read = recorder.read(data, 0, bufferSize);

                if (AudioRecord.ERROR_INVALID_OPERATION != read) {
                    try {
                        os.write(data);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }

            try {
                os.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public void stopRecording() {
        if (null != recorder) {
            isRecording = false;

            recorder.stop();
            recorder.release();

            recorder = null;
            recordingThread = null;
            copyWaveFile(getTempFilename(), getFilename1(), getFilename2());
            deleteTempFile();
        }

    }

    private void deleteTempFile() {
        File file = new File(getTempFilename());

        file.delete();
    }

    private void copyWaveFile(String inFilename, String outFileName1, String outFileName2) {
        FileInputStream in = null;
        FileOutputStream out1 = null, out2 = null;
        long totalAudioLen = 0;
        long totalDataLen = totalAudioLen + 36;
        long longSampleRate = RECORDER_SAMPLERATE;
        int channels = 2;
        long byteRate = RECORDER_BPP * RECORDER_SAMPLERATE * channels / 8;
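        // Note: the header below uses these hardcoded constants (44,100Hz, stereo,
        // 16-bit) even if findAudioRecord() actually returned a different configuration.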

        byte[] data = new byte[bufferSize];

        try {
            in = new FileInputStream(inFilename);
            out1 = new FileOutputStream(outFileName1);
            out2 = new FileOutputStream(outFileName2);
            totalAudioLen = in.getChannel().size();
            totalDataLen = totalAudioLen + 36;

            WriteWaveFileHeader(out1, totalAudioLen, totalDataLen, longSampleRate, channels, byteRate);
            WriteWaveFileHeader(out2, totalAudioLen, totalDataLen, longSampleRate, channels, byteRate);

            while (in.read(data) != -1) {

                out1.write(data); // Writing Non-Gained Data

                float rGain = 2.5f;
                for (int i = 0; i < data.length / 2; i++) {

                    short curSample = getShort(data[i * 2], data[i * 2 + 1]);
                    if (rGain != 1) {
                        // apply gain
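                        // (note: the compound assignment below narrows back to short, so a
                        // gained value above Short.MAX_VALUE wraps around and sounds distorted)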
                        curSample *= rGain;
                        // convert back from short sample that was "gained" to
                        // byte data
                        byte[] a = getByteFromShort(curSample);
                        // modify buffer to contain the gained sample
                        data[i * 2] = a[0];
                        data[i * 2 + 1] = a[1];
                    }

                }

                out2.write(data); // Writing Gained Data
            }
            out1.close();
            out2.close();
            in.close();

            Toast.makeText(this, "Done!!", Toast.LENGTH_LONG).show();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private short getShort(byte argB1, byte argB2) {
        return (short)((argB1 & 0xff) | (argB2 << 8));

    }

    private byte[] getByteFromShort(short x) {
        // variant 1 - noise
        byte[] a = new byte[2];
        a[0] = (byte)(x & 0xff);
        a[1] = (byte)((x >> 8) & 0xff);

        // variant 2 - noise and almost broke my ears - very loud
        // ByteBuffer buffer = ByteBuffer.allocate(2);
        // buffer.putShort(x);
        // buffer.flip();

        return a;
    }

    private void WriteWaveFileHeader(FileOutputStream out, long totalAudioLen, long totalDataLen, long longSampleRate, int channels, long byteRate)
    throws IOException {

        byte[] header = new byte[44];

        header[0] = 'R';
        header[1] = 'I';
        header[2] = 'F';
        header[3] = 'F';
        header[4] = (byte)(totalDataLen & 0xff);
        header[5] = (byte)((totalDataLen >> 8) & 0xff);
        header[6] = (byte)((totalDataLen >> 16) & 0xff);
        header[7] = (byte)((totalDataLen >> 24) & 0xff);
        header[8] = 'W';
        header[9] = 'A';
        header[10] = 'V';
        header[11] = 'E';
        header[12] = 'f';
        header[13] = 'm';
        header[14] = 't';
        header[15] = ' ';
        header[16] = 16;
        header[17] = 0;
        header[18] = 0;
        header[19] = 0;
        header[20] = 1;
        header[21] = 0;
        header[22] = (byte) channels;
        header[23] = 0;
        header[24] = (byte)(longSampleRate & 0xff);
        header[25] = (byte)((longSampleRate >> 8) & 0xff);
        header[26] = (byte)((longSampleRate >> 16) & 0xff);
        header[27] = (byte)((longSampleRate >> 24) & 0xff);
        header[28] = (byte)(byteRate & 0xff);
        header[29] = (byte)((byteRate >> 8) & 0xff);
        header[30] = (byte)((byteRate >> 16) & 0xff);
        header[31] = (byte)((byteRate >> 24) & 0xff);
        header[32] = (byte)(2 * 16 / 8);
        header[33] = 0;
        header[34] = RECORDER_BPP;
        header[35] = 0;
        header[36] = 'd';
        header[37] = 'a';
        header[38] = 't';
        header[39] = 'a';
        header[40] = (byte)(totalAudioLen & 0xff);
        header[41] = (byte)((totalAudioLen >> 8) & 0xff);
        header[42] = (byte)((totalAudioLen >> 16) & 0xff);
        header[43] = (byte)((totalAudioLen >> 24) & 0xff);

        out.write(header, 0, 44);
    }

}

I would like to know if I need to add any extra lines of code to make my AudioRecord work reliably on Samsung devices.


Solution

  • We're also struggling with audio recording on some Samsung Android devices. Unfortunately it seems to be very broken, as even different revisions of the same phone model behave differently with the same codebase.

    Here are my current findings, hoping you find something useful:

    1. Broken Initialization:

    Unfortunately, the strategy you are using to query for valid recording configurations will fail at least on Samsung Galaxy Young and Ace models running Android 2.3. The problem is that some invalid AudioRecord configurations, instead of simply failing, will completely brick the audio capture subsystem if tried. You'll need to reset the phone to recover from that state. (A minimal sketch of a more conservative probing approach appears after this answer.)

    2. Inconsistent sampling-rate support across revisions of the same phone model

    On an older Galaxy Ace phone, recording @ 11,025Hz, 16-bit mono will succeed. On newer Ace revisions, this AudioRecord configuration will be accepted as valid, but the resulting recording will be distorted, with a "chipmunk" effect. A very popular guitar tuner app that has hardcoded this sampling rate fails to give proper tuning readings on these phones precisely because of this problem!

    3. Extremely low volume audio capture on some configurations.

    In the Galaxy Young and Galaxy Ace, recording from the mic or default audio source @ 44,100Hz (the supposedly canonical rate where everything should work fine) produces an undistorted but extremely low-volume recording. I haven't yet found a way to fix this other than software amplification (which is the equivalent of magnifying a very low-res image, with the consequent "jaggedness" of the result). A sketch of such software amplification, with clipping, appears after this answer.

    4. Failure to support the canonical 44,100Hz sampling rate on every audio capture source.

    In the Galaxy Young and Galaxy Ace, recording from the Camcorder source fails @ 44,100Hz (again, the configuration will be accepted as valid), producing complete garbage. However, recording @ 8,000Hz, 16,000Hz and 48,000Hz works fine and produces a recording with very acceptable volume levels. What is frustrating is that, according to the Android documentation, 44,100Hz is a sampling rate all devices SHOULD support.

    5. OpenSL does not fix any of the problems reported.

    Working with the NDK and OpenSL produces the same results as described. It seems that the AudioRecord class is simply wrapping calls to OpenSL, and the problem is either hardware-based or buried at a lower level in the kernel code.

    This situation is very unfortunate indeed, as these models are becoming very popular - at least in Mexico.

    Good luck - and please report if you had better luck working with these phones. =)
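
Based on the findings above, here is a minimal sketch of a more conservative approach: probe only a small whitelist of configurations (16-bit PCM mono at the rates that worked in practice, with 44,100Hz as a last resort), release any recorder that fails to initialize, and amplify in software with clipping so gained samples cannot wrap around the 16-bit range. The rate list, gain handling and class/method names here are illustrative, not a verified fix for every Samsung model; whichever recorder is returned, the sample rate written into the WAV header should come from recorder.getSampleRate() so playback speed matches the captured data.

import android.media.AudioFormat;
import android.media.AudioRecord;

public class SafeRecorderHelper {

    // Rates reported to behave on the problematic models (see findings above);
    // 44,100Hz is kept as a last resort even though it SHOULD be universal.
    private static final int[] PREFERRED_RATES = { 8000, 16000, 48000, 44100 };

    // Probe a small whitelist (16-bit PCM, mono) instead of every possible
    // rate/format/channel combination, and release recorders that fail, to
    // avoid leaving the capture subsystem in a bad state.
    public static AudioRecord findSafeAudioRecord(int audioSource) {
        for (int rate : PREFERRED_RATES) {
            int bufferSize = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            if (bufferSize <= 0) {
                continue; // rate not supported at all
            }
            AudioRecord recorder = null;
            try {
                recorder = new AudioRecord(audioSource, rate,
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                        bufferSize);
                if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
                    // Caller should use recorder.getSampleRate() for the WAV header.
                    return recorder;
                }
            } catch (IllegalArgumentException e) {
                // configuration rejected on this device
            }
            if (recorder != null) {
                recorder.release();
            }
        }
        return null;
    }

    // Software amplification for 16-bit little-endian PCM with clipping:
    // without the clamp, gained samples wrap around and are heard as noise.
    public static void applyGain(byte[] pcm, float gain) {
        for (int i = 0; i + 1 < pcm.length; i += 2) {
            int sample = (short) ((pcm[i] & 0xff) | (pcm[i + 1] << 8));
            int gained = Math.round(sample * gain);
            if (gained > Short.MAX_VALUE) gained = Short.MAX_VALUE;
            if (gained < Short.MIN_VALUE) gained = Short.MIN_VALUE;
            pcm[i] = (byte) (gained & 0xff);
            pcm[i + 1] = (byte) ((gained >> 8) & 0xff);
        }
    }
}

For example, one could try SafeRecorderHelper.findSafeAudioRecord(MediaRecorder.AudioSource.CAMCORDER), fall back to MediaRecorder.AudioSource.MIC if it returns null, and call applyGain(data, 2.5f) in place of the unclamped multiplication inside copyWaveFile().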