Problem With Streaming Data From Two Devices

I have a listener function that listens for notifications from a peripheral. Here's the relevant part:

```rust
let mut notification_stream = peripheral_struct.peripheral.notifications().await.map_err(|e| e.to_string())?;
while let Some(data) = notification_stream.next().await {
    // ... processing data ...
}
```

Initially, the first peripheral connects and streams data at the expected speed. However, when a second peripheral connects and starts streaming, the streaming speed of the first peripheral drops significantly. I'm tracking this by printing received packet counts and elapsed durations, which clearly shows the slowdown.
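
Concretely, the number I'm comparing is just packets over elapsed time. A hypothetical helper (not in my actual code) along these lines captures what I'm measuring:

```rust
use std::time::Instant;

// Illustrative only: packets received between two Instants,
// expressed as packets per second.
fn packets_per_second(packet_count: u64, start: Instant, end: Instant) -> f64 {
    packet_count as f64 / end.duration_since(start).as_secs_f64()
}
```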

```rust
#[tauri::command]
async fn listen_for_changes(window: Window, peripheral_struct: &PeripheralStruct) -> Result<(), String> {
    let read_characteristic = match &peripheral_struct.read_uart_tx_characteristic {
        Some(characteristic) => characteristic,
        None => return Err("Read characteristic not found".to_string()),
    };

    let mut packet_counter = 0;
    let mut first_packet_time = None;
    let mut last_packet_time = None;
    let channel = match peripheral_struct.pod_side {
        PodSide::Left => "processed-data-left",
        PodSide::Right => "processed-data-right",
    };
    
    if read_characteristic.properties.contains(CharPropFlags::NOTIFY) {
        println!("Subscribed to notifications. Waiting for messages...");
        let mut notification_stream = peripheral_struct.peripheral.notifications().await.map_err(|e| e.to_string())?;
        while let Some(data) = notification_stream.next().await {
            if data.value.len() > 20 {
                if packet_counter == 0 {
                    first_packet_time = Some(Instant::now());
                }

                let processed_messages = helper::process_messages(&data.value, peripheral_struct.pod_side.clone());
                packet_counter += 4;
                for message in processed_messages {
                    window.emit(channel, Some(&message))
                        .map_err(|e| e.to_string())?;
                }  

                if packet_counter % 250 == 0 {
                    last_packet_time = Some(Instant::now());
                    println!("Received {} packets", packet_counter);
                    if let (Some(start), Some(end)) = (first_packet_time, last_packet_time) {
                        println!("Duration: {:?}", end.duration_since(start));
                    }
                }
            } else {
                println!("Received data < 20 from {:?} [{:?}]: {:?}", peripheral_struct.name, data.uuid, data.value);
            }
            
        }
        println!("Exited notification loop.");
    } else {
        println!("Characteristic does not support notifications.");
    }
    Ok(())
}
```

Both peripherals are identical and work well with an iOS app written in SwiftUI, so I suspect I'm making a mistake in the way I'm listening to them.

I tried removing the processing logic by taking out the `let processed_messages = helper::process_messages(&data.value, peripheral_struct.pod_side.clone());` line, but it had no effect. What is the best way to stream from two peripherals simultaneously?
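
For context, the shape I've been aiming for is roughly the following sketch: each peripheral gets its own Tokio task, so the two notification streams are polled independently. This assumes `PeripheralStruct` is `Send + 'static` and uses a hypothetical `listen_for_changes_owned` that takes ownership instead of a reference:

```rust
use tauri::Window;

// Sketch only: `listen_for_changes_owned` is a hypothetical variant of
// `listen_for_changes` that takes `PeripheralStruct` by value.
fn spawn_listener(window: Window, peripheral_struct: PeripheralStruct) {
    // Each listener runs on its own task, so one stream never has to
    // wait at the other's `.await` points.
    tokio::spawn(async move {
        if let Err(e) = listen_for_changes_owned(window, peripheral_struct).await {
            eprintln!("listener error: {e}");
        }
    });
}
```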

Here is how I call the function after the peripheral connects:

```rust
async fn manage_concurrent_streams(
    window: Window,
    peripheral: PeripheralStruct, // Take ownership
) -> Result<(), String> {
    match peripheral.pod_side {
        PodSide::Left => {
            let peripheral_clone = peripheral.clone(); // Clone if possible
            tokio::spawn(async move {
                listen_for_changes(window.clone(), &peripheral_clone).await
            });
        },
        PodSide::Right => {
            tokio::spawn(async move {
                listen_for_changes(window, &peripheral).await
            });
        }
    }

    Ok(())
}
```
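
Another pattern I've seen suggested (untested on my side) is merging the two notification streams into a single loop with `futures::stream::select`, so both are serviced from one task. `listen_merged` below is just an illustrative name; the routing comment marks where per-pod handling would go:

```rust
use futures::stream::{select, StreamExt};

// Sketch: btleplug's `notifications()` streams yield the same item type,
// so the two streams can be merged. `select` alternates between them as
// items become ready, so neither peripheral starves the other.
async fn listen_merged(left: PeripheralStruct, right: PeripheralStruct) -> Result<(), String> {
    let left_stream = left.peripheral.notifications().await.map_err(|e| e.to_string())?;
    let right_stream = right.peripheral.notifications().await.map_err(|e| e.to_string())?;
    let mut merged = select(left_stream, right_stream);
    while let Some(data) = merged.next().await {
        // Route by characteristic UUID (or tag each stream with `.map(...)`),
        // since the merged stream no longer indicates the source peripheral.
        println!("notification on {:?}: {} bytes", data.uuid, data.value.len());
    }
    Ok(())
}
```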


Thanks
