I’ve been at this for a week now and can’t figure out how to properly place/post a file. I’m using openai version 3.2.1 on both React Native (running via Expo) and in a NestJS backend. On the Expo front end I successfully record a .wav file, base64 encoded (I’ve confirmed it’s a working file), but every attempt to create a valid file using buffers/blobs/File objects yields an empty file. I also noticed Expo doesn’t play well with OpenAI’s library, so I switched to sending the base64 to the NestJS backend, but there I’m not able to construct the “File” object that openai requires. Can anyone point out what I’m doing wrong?
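For the Node/NestJS side, here’s a minimal sketch of turning the base64 payload back into real bytes on disk. This assumes the Expo app sends the raw base64 string (no `data:` URI prefix); the function name `saveBase64Wav` is just illustrative, not from any SDK:

```typescript
import fs from "fs";
import os from "os";
import path from "path";

// Decode a base64-encoded .wav payload (as received from the Expo app)
// into a Buffer and write it to a temp file that Node can stream from.
function saveBase64Wav(base64Audio: string, dir: string = os.tmpdir()): string {
  const buffer = Buffer.from(base64Audio, "base64");
  const filePath = path.join(dir, `recording-${Date.now()}.wav`);
  fs.writeFileSync(filePath, buffer);
  return filePath;
}
```

If the decoded buffer comes out empty here, the base64 string itself is the problem (often a stripped `data:audio/wav;base64,` prefix or a truncated upload), not the file-writing step.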
Could you please provide the blob portion of the code? I feel like my issue is around the buffer/blob handling. I just noticed that my React Native buffer is empty when using `Buffer.from`, but on the Node side `Buffer.from` isn’t.
Also, when I try passing `fs.createReadStream` into `.createTranscription`, it gives me: “Argument of type ‘ReadStream’ is not assignable to parameter of type ‘File’.”
It’s like the file is corrupt on my React Native side, and then on the Node side I can’t feed it anything but “File”.
I don’t think what I did applies to React Native, since I’m using browser functionality to record the audio, but here it is. I set up a MediaRecorder in an effect and use buttons to call the mediaRecorder.start() and stop() functions.
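Roughly, the browser-side setup described above looks like this. It’s a sketch only (browser APIs, so it won’t run in React Native), and the names `setupRecorder`/`onRecordingReady` are illustrative:

```typescript
// Browser-only sketch: create a MediaRecorder once (e.g. inside an effect),
// then drive it from start/stop buttons. The finished recording is handed
// back as a Blob via the onRecordingReady callback.
type RecorderHandle = {
  init: () => Promise<void>;
  start: () => void;
  stop: () => void;
};

function setupRecorder(onRecordingReady: (blob: Blob) => void): RecorderHandle {
  let chunks: Blob[] = [];
  let recorder: MediaRecorder | null = null;

  async function init(): Promise<void> {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    // Chrome records webm regardless of the filename/extension you later
    // put on the FormData entry, which matches the behavior described here.
    recorder = new MediaRecorder(stream, { mimeType: "audio/webm" });
    recorder.ondataavailable = (e: BlobEvent) => chunks.push(e.data);
    recorder.onstop = () => {
      onRecordingReady(new Blob(chunks, { type: "audio/webm" }));
      chunks = [];
    };
  }

  return {
    init,
    start: () => recorder?.start(),
    stop: () => recorder?.stop(),
  };
}
```

The start/stop buttons would simply call `handle.start()` and `handle.stop()` after `init()` has resolved.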
I also noticed that here I set the mimeType to webm while on the formData I set it to wav, but it doesn’t cause a problem. Chrome is probably already setting the mimeType of the recording to video/webm, according to lots of forum posts.
Thanks for all the info. My NestJS backend was complaining that it wasn’t a File object, but I just cast it to any, `fs.createReadStream(path) as any`, and it worked.