I need to build a web application that uses WebRTC to capture the webcam video stream and the mic audio stream and immediately send them to a server for broadcasting to multiple clients. The app must work in real time and in full duplex; I mean it would be a kind of live video chat, some sort of educational app. So the question is: is this possible now? What technologies should I use? Should I use WebRTC with WebSocket and Node.js on the backend? Or can I use PHP instead of Node? Can I use Socket.IO for that? Are there other ways to achieve this? Maybe Flash?
The PeerConnection API in WebRTC does not require a back-end server to conduct the connections themselves: one or more connections can run directly between peers.
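For concreteness, here's a minimal browser-side sketch of that flow, written against the current (unprefixed) API shape; `signalingSend` is a hypothetical stand-in for whatever channel you use to reach the other peer:

```ts
// Minimal browser-side sketch: open a peer connection and create an SDP
// offer. "signalingSend" is a hypothetical function standing in for your
// signaling channel (WebSocket, Ajax, etc.).
declare function signalingSend(message: unknown): void;

const pc = new RTCPeerConnection({
  // A public STUN server helps peers discover their external addresses.
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

async function startCall(): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // Hand the offer to the other peer; the server only relays it.
  signalingSend({ type: "offer", sdp: pc.localDescription });
}

// ICE candidates also travel over the signaling channel.
pc.onicecandidate = (event) => {
  if (event.candidate) {
    signalingSend({ type: "candidate", candidate: event.candidate });
  }
};
```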
The only thing you need a back-end server for is signaling: acting as a mediator that first establishes the connection between the peers. To that end, you can use the WebSocket API, Ajax, or any other means of getting those messages across. And yes, you can use PHP to write the server side for WebSocket (or whatever method you pick to establish the peer-to-peer connection). It's really up to you.
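As a sketch of what that mediator can look like (assuming Node.js with the `ws` package here; a PHP WebSocket server would play exactly the same role), note that it only shuffles text messages around and never sees any media:

```ts
// Node.js signaling sketch using the "ws" package (npm install ws).
// The server only relays SDP offers/answers and ICE candidates between
// peers; the audio/video never touches it.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const peers = new Set<WebSocket>();

wss.on("connection", (socket) => {
  peers.add(socket);
  socket.on("message", (data) => {
    // Forward the signaling message to every other connected peer.
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(data.toString());
      }
    }
  });
  socket.on("close", () => peers.delete(socket));
});
```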
At the moment, only Chrome and Firefox support enough of the WebRTC APIs to make video chatting a possibility. Very soon though, Opera will likely join the mix, but no one's sure yet whether WebRTC will be added to IE11 or not, and Apple seems to have no intention of adding WebRTC to Safari any time soon (because they have their own proprietary technology for that; sound familiar?!).
Anyway, WebRTC is your best bet. As an added note, I don't think it's possible to use JS to send video and audio to a server, and then have the server forward that data to the other peer(s). Instead, you need to use WebRTC to establish peer-to-peer connections, and then go from there.
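To make the browser side of that concrete, here is a sketch of grabbing the camera and mic and handing the tracks to the peer connection, again using the current `getUserMedia`/`addTrack` API shape (the older prefixed `getUserMedia`/`addStream` calls work the same way conceptually):

```ts
// Browser-side sketch: capture the webcam and mic, then hand each track
// to the peer connection so the media flows peer-to-peer, not through
// your server.
async function attachLocalMedia(pc: RTCPeerConnection): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }
}

// Receiving side: render whatever the remote peer sends.
function showRemoteMedia(pc: RTCPeerConnection, video: HTMLVideoElement): void {
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };
}
```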
Edit: If you use a TURN server, you can reroute your audio and video data through a server, but that's actually the least ideal situation, and you can still only do that if you're using the WebRTC APIs.
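Using a TURN relay is just a configuration change on the peer connection; the hostname and credentials below are placeholders for a server you deploy yourself (coturn is a common open-source option):

```ts
// Sketch of allowing media to fall back to a TURN relay. The TURN URL
// and credentials are placeholders, not a real deployment.
const relayedPc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.l.google.com:19302" },
    {
      urls: "turn:turn.example.com:3478", // placeholder hostname
      username: "demo",                   // placeholder credentials
      credential: "secret",
    },
  ],
});
```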