I am new to Rust and would like suggestions for crates to implement the following scenario:
- We have two machines:
1.1 - Normal desktop - connected to the internet,
1.2 - RT (real-time) machine - not connected to the internet.
- The two machines are connected to each other over TCP/IP at a given ip:port.
- To communicate between these two machines, we used to use WCF (Windows Communication Foundation) with the request-reply or duplex pattern. Now we want to make it platform independent and are considering gRPC as a better option. I assume that Tonic, built on Tokio, is a stable crate to consider here.
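For reference, a gRPC contract with tonic starts from a .proto file; a minimal sketch of the request-reply pattern plus a server-streaming RPC (the closest analogue to WCF's duplex callbacks) could look like this. All service and message names here are placeholders, not anything from your setup:

```protobuf
syntax = "proto3";
package rtlink;

service RtLink {
  // Classic request-reply, like WCF's request-reply contract.
  rpc Execute (CommandRequest) returns (CommandReply);
  // Server streaming can stand in for WCF's duplex callback channel.
  rpc Subscribe (SubscribeRequest) returns (stream Event);
}

message CommandRequest { string payload_json = 1; }
message CommandReply   { string payload_json = 1; }
message SubscribeRequest {}
message Event          { string payload_json = 1; }
```

tonic-build can generate the Rust client and server stubs for this from a build.rs script.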
During these remote procedure calls between the two machines, I would like to broadcast all of the relevant information to a website where an authenticated user can see it live at runtime. The relevant information is:
- Request sent by the RT machine to the normal desktop.
- Confirmation that the request was received by the normal desktop.
- Response sent by the normal desktop.
This information is in *.json format and is expected to be rendered at runtime and broadcast live to the user. Can this broadcasting part also be achieved with the same crate, Tokio, or are there better options available?
I haven't understood the second part (who hosts your web server, the desktop? What messages are JSON?), but if you want to use gRPC, tonic would be a great choice. An alternative to gRPC and protobuf I haven't tried yet but looks very interesting is Cap'n Proto, which has Rust bindings.
Ouch, I did not think of hosting a web server for the data stream broadcast. Would hosting in the cloud be a better option?
I can't answer that as I haven't fully understood your setup. You have some device connected to a desktop and you want users to see the data the device produces in real time in some sort of dashboard in a browser, right? What machine will the user use? Is it the desktop itself? Or some other machine in the local network? Or do the users connect to the dashboard through the internet?
The desktop is the only machine connected to the internet; no other device is connected to it besides the RT machine.
The authenticated user gets to see all request/response data from/to the RT machine. So the RT machine is the client that sends requests to the desktop, and the desktop responding to those requests is the server. Besides serving the response to the client, it also has to serve both the request and response data to the dashboard. My question is exactly how to send data to the dashboard in real time. So far I am assuming the dashboard is a web interface, and the user authenticates and consumes the dashboard over the internet.
Could you host the web-server on your desktop? I.e. maybe you have a static IP or can use DDNS to access the desktop directly from the internet. Then you could serve both the RT device and your users using the same program, or at least the same machine running two processes. Then you wouldn't have to add a cloud instance somewhere which could be pricey and adds latency to your requests.
During the discussion, it came to mind that we also want to store the data that we broadcast to the dashboard in the cloud. In that case, how much more would we need to adapt your previous suggestion?
I don't know the data, so I can't give specific recommendations, but data warehousing and real-time processing with the possibility to create dashboards on top is pretty ubiquitous across all major cloud providers. GCP has BigQuery, Dataflow and Cloud Monitoring. AWS has Redshift, which you could use with Grafana. I like Elasticsearch with Kibana. The list goes on and on.