How to deal with a response? (web site)

If you get angry at the low level of this question, please excuse me. It will be the last time.

I tried to parse it using scraper, and I tried to use something like Response.Substring(...).

I cannot find any discussions on internet forums. All Google links lead to docs.rust.com. The Cookbook does not cover my situation (and it is not unique; it is very, very simple, extra simple).

The compiler does not cooperate; instead it just says: No! No! error! `error: expected expression, found ?`, `expected struct String, found &String`, `error[E0308]: mismatched types`, and so on.

OK... but what should I do? It does not matter to me whether it is a String, an Option, or a str. I agree to everything! I just want to extract a tiny piece of text.

It seems that a small task in other languages is a big challenge in Rust.

Here is my two days of suffering:

[dependencies]
local-ip-address = "0.4.4"
chrono = "0.4"
reqwest = "0.11"
tokio = { version = "1", features = ["full"] }
scraper = "0.13.0"
let mut my_ip = String::from("");
my__ip(&my_ip);

////////////////////////////////////////////////////////
// use scraper::{Html, Selector};
// use reqwest::Client;
#[tokio::main]
async fn my__ip /* main */ (mut my_ip: &str) -> Result<(), reqwest::Error> {
    let my_rsp: reqwest::Response = reqwest::get("http://checkip.dyndns.com").await?; // println!("rsp {:#?}", my_rsp);

    println!("\nrsp {:#?}", my_rsp);

    // "<html><head><title>Current IP Check</title></head><body>Current IP Address: 143.244.44.173</body></html>\r\n"
    let my_htm: String = &my_rsp.text().await?;

    // my_ip = &my_htm[5..22];
    // println!("\nhtm {:#?}", &my_htm[..my_htm.to_string().len() - 11]);
    // println!("\nip1 {:#?}", my_ip);
    println!("\nhtm {:#?}", my_htm);

    let my_doc = scraper::Html::parse_document(&my_htm);
    println!("\ndoc? {:?}", &my_doc);
    println!("\ndoc# {:#?}", my_doc);

    let my_fmt = scraper::Html::parse_fragment(&my_htm);
    println!("\nfmt? {:?}", my_fmt);
    println!("\nfmt# {:#?}", my_fmt);

    let my_sel = scraper::Selector::parse("body").unwrap();
    println!("\nsel? {:?}", my_sel);
    println!("sel# {:#?}", my_sel);
    println!("selen {:#?}\n", my_sel.selectors.len());

    for element in my_fmt.select(&my_sel) {
        println!("\nel? {:?}", element.text());
        println!("elvl {:?}", element.value());
        println!("elnm {:?}", element.value().name().to_string());
        // assert_eq!("body", element.value().name());
    }

    // let mut my_elm = "";
    for element in my_doc.tree {
        println!("tr? {:?}", element);
        if element.is_text() // && element.as_text().contains("Current IP Address:")
        {
            // my_elm = &Option::Some(()).clone();
            println!("trtxt {:?}", element.as_text());
        }
        // println!("\nslc2. {:?}", element.value());
        // println!("\nslc3. {:?}", element.value().name());
        // assert_eq!("body", element.value().name());
    }

    return Ok(());
}
////////////////////////////////////////////////////////

Here's one of the errors mentioned in your post:

error[E0308]: mismatched types
  --> src/main.rs:15:30
   |
15 |         let my_htm: String = &my_rsp.text().await?;
   |                     ------   ^^^^^^^^^^^^^^^^^^^^^ expected struct `String`, found `&String`
   |                     |
   |                     expected due to this
   |
note: return type inferred to be `String` here
  --> src/main.rs:10:41
   |
10 |         let my_rsp: reqwest::Response = reqwest::get("http://checkip.dyndns.com").await?; // println !   (   "rsp  {:#?}  "  , my_rsp   ) ;
   |                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
help: consider removing the borrow
   |
15 -         let my_htm: String = &my_rsp.text().await?;
15 +         let my_htm: String = my_rsp.text().await?;
   | 

Notice that under the error message, it shows you what you need to do to fix the issue - just remove the &.
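
For reference, the corrected line in full:

    let my_htm: String = my_rsp.text().await?;

Response::text() takes the response by value and already returns an owned String, so there is nothing to borrow.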

If you didn't see any of the info I posted above, you may be looking at the error messages via your editor, rather than the command line. That is useful for quick feedback, but if you ever get stuck trying to figure out why an error is occurring, your first step should always be to run cargo check on the command line, as it will usually give you much more detailed info (or even just tell you how to fix the issue).
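
For example, from the project directory:

    cargo check

will print the full multi-line diagnostic shown above, including the note and help sections that editors often truncate.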

I couldn't replicate the error you mentioned about 'expected expression, found ?' on the playground, but that may be due to the scraper crate not being available there. If you post the cargo check error you get for that, people may be able to help more :slight_smile:


Sorry, the code you posted is so badly formatted that I can't read any of it; even after trying to format it properly, there are still errors.

If you want to get your WAN IP address, here's a way to do it with the same dependencies:

use std::net::IpAddr;

async fn wan_ip() -> Result<String, reqwest::Error> {
    reqwest::get("https://api.ipify.org/?format=text")
        .await?
        .text()
        .await
}

#[tokio::main]
async fn main() {
    let ip = wan_ip().await.unwrap();
    let address: IpAddr = ip.parse().unwrap();
    println!("{:?}", address);
}
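
If you specifically want to keep the checkip.dyndns.com endpoint from your original code, its body contains the marker "Current IP Address: ", so plain string methods are enough. Here's a minimal sketch (the function name is just illustrative, and it assumes the body keeps exactly the layout shown in your post; no HTML parser needed):

async fn checkip_ip() -> Result<String, reqwest::Error> {
    let body = reqwest::get("http://checkip.dyndns.com").await?.text().await?;
    // Take everything after the marker, then cut at the next closing tag.
    let ip = body
        .split("Current IP Address: ")
        .nth(1)
        .and_then(|rest| rest.split('<').next())
        .unwrap_or("")
        .trim()
        .to_string();
    Ok(ip)
}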

Please take a look at the formatting guidelines of the forum; additionally, you can use cargo fmt to format your code in a standard way before sharing it.


These days, it would likely be best to use api64.ipify.org, which supports both IPv4 and IPv6. Ideally, our programs should be compatible with IPv6-only networks, even though adoption is currently low.


Yes, probably, and since IpAddr is an enum of V4 and V6, it's just a matter of changing the url :wink:
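
For completeness, a tiny illustration of that enum (hypothetical function, just to show the shape):

use std::net::IpAddr;

fn describe(addr: IpAddr) {
    // IpAddr has exactly two variants, one per address family.
    match addr {
        IpAddr::V4(v4) => println!("IPv4: {v4}"),
        IpAddr::V6(v6) => println!("IPv6: {v6}"),
    }
}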

