How do I initialize this global variable safely?

The global variable's type, JNIEnv from the jni crate, does not implement Sync, so the normal solutions for global variables do not work.
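For reference, this is the kind of "normal solution" I mean. It only compiles because the stored type is Sync; statics require that, which is exactly what JNIEnv fails. (I'm using std's OnceLock here for illustration; once_cell's OnceCell is the same idea.)

use std::sync::OnceLock;

// A late-initialized global: fine for String, which is Sync.
// Swapping in JNIEnv fails, because statics require Sync.
static GREETING: OnceLock<String> = OnceLock::new();

fn main() {
    GREETING.set(String::from("hello")).unwrap();
    println!("{}", GREETING.get().unwrap());
}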

I am aware that holding a stale copy of a JNI environment can lead to broken behavior in many cases, but I am only accessing one class that is guaranteed to never change, and only ever calling functions within it, so it should be safe to keep a reference to it.

The reason I need a global copy of the JNI environment is that a Lua VM from the mlua crate must be able to call functions in Java, and, critically, it must do so without the Lua code itself being able to access the JNI environment. The Lua code being run is user-specified, so exposing the environment to it would be a massive security hole; I cannot sandbox the Lua code if it has access to the JNI environment.

If someone could think of a better way of doing this, I would be grateful, as I really don't like this solution. But I cannot think of any other way this could be accomplished.

Here's the code that I'm talking about (link):

// [...]

static mut JNI_ENV: Option<JNIEnv> = Option::None;

// Function for Lua to call to draw a pixel on the screen
unsafe fn draw_pix(x: u32, y: u32)
{
    let dllhandler: JClass = match JNI_ENV.expect("JNI_ENV is None").find_class("bluefire_06/mimosa_computers/lua/DLLHandler") {
        Ok(class) => class,
        Err(e) => {
            println!("{e}");
            JClass::default()
        }
    };

    let args: [JValue; 2] = [
        JValue::Int(x.try_into().expect("X value is out of range")),
        JValue::Int(y.try_into().expect("Y value is out of range")),
    ];

    if let Err(e) = JNI_ENV.expect("JNI_ENV is None").call_static_method(dllhandler, "drawPix", "(II)V", &args) {
        println!("{e}");
    }
}

// [...]

#[no_mangle]
pub extern "system" fn Java_bluefire_106_mimosa_1computers_lua_DLLHandler_initLuaDLL(env: JNIEnv, _class: JClass)
{
    // I know this isn't valid
    let env_copy = env.clone();
    unsafe {
        JNI_ENV = Some(env_copy);
    }

    // [...]
}

// [...]

Also, I know this can't be run in the playground, but that would be basically impossible even if I included more code, as you can't create a JNIEnv without actually having Java run the native code.

You should not put the JNIEnv in a global variable. It's only guaranteed to be valid for the duration of the function call the JVM performs. If you don't want to keep passing it explicitly via function parameters, you can use something like the scoped-tls crate to pass it through the scope implicitly.
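The pattern looks roughly like this. Treat it as a sketch with a stand-in context type: JNIEnv<'a> carries a lifetime that a static can't name directly, so in practice you would scope a wrapper around whatever per-call state you need.

use scoped_tls::scoped_thread_local;

// Stand-in for whatever per-call context you would really thread through.
struct Ctx {
    message: String,
}

scoped_thread_local!(static CTX: Ctx);

fn draw_pix(x: u32, y: u32) {
    // Only reachable while a `set` call is on the stack above us.
    CTX.with(|ctx| println!("{}: pixel at ({x}, {y})", ctx.message));
}

fn main() {
    let ctx = Ctx { message: String::from("jni call") };
    // Visible to every `with` below this frame for the closure's duration,
    // without a global and without appearing in any signature.
    CTX.set(&ctx, || draw_pix(3, 4));
}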


Just so you know, your second suggestion is the only valid one you provided. The first is not an option, because the function I would need to pass the JNIEnv into is called by Lua code that is user-specified and run in a sandboxed environment on a server. You can probably see why user-specified code having access to the JNI environment would be a massive security problem.

Edit: Wait, neither of your suggestions works. Unless I'm missing something really obvious, I cannot access the JNIEnv whatsoever outside of the function I receive it in. This is really annoying, as it basically means I have to use C++. I have done this exact thing in C++ before (made a copy of the JNIEnv so I could still access Java functions elsewhere), and it never caused errors, as I was careful to only use the JNIEnv for calling functions that I knew would never change.

Also, I'm not sure you even read why I needed to do this in the first place, as you would have realized that both of your suggestions don't work for what I'm doing.

No, they're right. JNIEnv<'a> is only valid for the 'a lifetime. What you're talking about doing in C++ is basically the same as getting the underlying raw pointer and converting it back into a JNIEnv later, disregarding the lifetime entirely.

let raw_ptr = env.get_native_interface();

// SAFETY: All bets are off!
let env = unsafe { JNIEnv::from_raw(raw_ptr).unwrap() };

I read it, and I still find it suspicious. If all you have is an API of free functions without any VM context, you should probably reach for thread-local storage over global state. If it were my codebase, I would personally focus on passing context through function arguments.

You currently keep the Lua VM alive in its stack frame with an unconditional infinite loop, so it seems reasonable to assume the JNIEnv in that stack frame is also going to remain accessible for the same lifetime. The closures in the registered functions should be able to borrow it without relying on global state and without exposing it to the code running in the VM at all.
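Very roughly, something like this. Treat it as a sketch only, not tested code: it borrows the class and method names from your snippet, and assumes jni 0.20-style &self methods plus mlua's scope API.

use jni::objects::{JClass, JValue};
use jni::JNIEnv;
use mlua::Lua;

#[no_mangle]
pub extern "system" fn Java_bluefire_106_mimosa_1computers_lua_DLLHandler_initLuaDLL(env: JNIEnv, _class: JClass)
{
    let lua = Lua::new();

    // `scope` lets registered functions borrow `env` from this stack frame
    // instead of demanding 'static closures. The script can call draw_pix,
    // but it never sees the JNIEnv itself.
    lua.scope(|scope| {
        let draw_pix = scope.create_function(|_, (x, y): (i32, i32)| {
            let class = env
                .find_class("bluefire_06/mimosa_computers/lua/DLLHandler")
                .map_err(mlua::Error::external)?;
            env.call_static_method(class, "drawPix", "(II)V", &[JValue::Int(x), JValue::Int(y)])
                .map_err(mlua::Error::external)?;
            Ok(())
        })?;
        lua.globals().set("draw_pix", draw_pix)?;

        // Run the user-supplied script while the borrow of `env` is alive.
        lua.load("draw_pix(3, 4)").exec()
    })
    .expect("lua error");
}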


Oh. Why did I not think of that? I was going to have it in multiple threads anyway (so more than one computer can run at once without blocking the others). Also, what do you mean by "an API of free functions without any VM context"?

I was referring to the "free" function draw_pix, as opposed to a struct method which takes &self and can thereby provide contextually relevant information to the function, such as which VM (Lua or JVM, for that matter) is calling it. Don't sweat the details; it's just a way of saying encapsulation is your friend.
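A contrived illustration of the difference:

// A free function only knows what its arguments tell it.
fn draw_pix_free(_x: u32, _y: u32) { /* which VM asked for this? no idea */ }

// A method carries context in `self`.
struct ComputerEnv {
    id: u32, // plus handles to the Lua VM, JNI context, ...
}

impl ComputerEnv {
    fn draw_pix(&self, x: u32, y: u32) {
        // `self` tells us which computer is drawing.
        println!("computer {} drawing at ({x}, {y})", self.id);
    }
}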

I don't know if this is because I don't know how to use thread_local!, or something else, but for some reason, I cannot write to (but I can read from) the JNI_ENV variable because it can't be transferred between threads. What? Isn't the entire point of thread_local! that it doesn't get transferred between threads? Anyways, here's my code and the error:

// [...]

thread_local!(static JNI_ENV: RefCell<Option<JNIEnv<'static>>> = RefCell::new(None));

fn get_thread_jni_env() -> JNIEnv<'static>
{
    // This doesn't produce errors.
    JNI_ENV.with(|jni_env| jni_env.borrow().clone()).expect("JNI_ENV is None")
}

// [...]

#[no_mangle]
pub extern "system" fn Java_bluefire_106_mimosa_1computers_lua_DLLHandler_createComputerEnvDLL(_env: JNIEnv, _class: JClass)
{

    let _t: JoinHandle<()> = thread::spawn(move|| {
        // Here's the problematic line
        JNI_ENV.with(|jni_env: &RefCell<Option<JNIEnv>>| *jni_env.borrow_mut() = Some(_env.clone()));

        // Keep thread running
        loop {
            
        }
    });
}

And here's the error:

error[E0277]: `*mut *const JNINativeInterface_` cannot be sent between threads safely
   --> src\lib.rs:95:44
    |
95  |       let _t: JoinHandle<()> = thread::spawn(move|| {
    |                                ------------- ^-----
    |                                |             |
    |  ______________________________|_____________within this `[closure@src\lib.rs:95:44: 95:50]`
    | |                              |
    | |                              required by a bound introduced by this call
96  | |         JNI_ENV.with(|jni_env: &RefCell<Option<JNIEnv>>| *jni_env.borrow_mut() = Some(_env.clone()));
97  | |         
98  | |         let mut lua: Lua = Lua::new();
...   |
107 | |         }
108 | |     });
    | |_____^ `*mut *const JNINativeInterface_` cannot be sent between threads safely
    |
    = help: within `[closure@src\lib.rs:95:44: 95:50]`, the trait `Send` is not implemented for `*mut *const JNINativeInterface_`
    = note: required because it appears within the type `JNIEnv<'_>`
note: required because it's used within this closure
   --> src\lib.rs:95:44
    |
95  |     let _t: JoinHandle<()> = thread::spawn(move|| {
    |                                            ^^^^^^
note: required by a bound in `spawn`
   --> D:\rust\.rustup\toolchains\stable-x86_64-pc-windows-msvc\lib/rustlib/src/rust\library\std\src\thread\mod.rs:662:8
    |
662 |     F: Send + 'static,
    |        ^^^^ required by this bound in `spawn`

The error is telling you that you can't send raw pointers to other threads. JNIEnv contains a raw pointer, so it is !Send. In fact, you can see it in the docs: JNIEnv in jni - Rust (docs.rs)
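You can see how structural the inference is with a toy type:

// Send/Sync are inferred from a type's fields: anything containing a raw
// pointer is automatically !Send unless its author opts back in unsafely.
struct Wrapper(*mut u8);

fn assert_send<T: Send>() {}

fn main() {
    assert_send::<Vec<u8>>(); // compiles: Vec<u8> is Send
    // assert_send::<Wrapper>(); // error: `*mut u8` cannot be sent between threads safely
    let _ = Wrapper(std::ptr::null_mut());
}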

Even if you make it Send, you shouldn't be trying to extend its lifetime to 'static. This is just a round-about way of trying to do the get-raw-pointer-and-recreate-later dance. The compiler is preventing you from shooting yourself in the foot.


I was blindly clicking the "quick fix" button in VS Code. Given how often its suggestions are entirely wrong (even with the rust-analyzer extension installed), I don't know why I trusted it to give me sensible code. Also, your reply has resulted in a non-insignificant amount of internal screaming about how lifetime annotations make no sense whatsoever. I've managed to figure them out in the more basic cases, but in the specific case of the thread_local! macro, they make no sense. Where on Earth are you supposed to actually define the lifetime within the macro? Nothing makes any sense.

When you use with to get a reference to a thread-local, that reference cannot escape the with closure under any circumstances. That's what the lifetimes on the method do.

You cannot store a value with a lifetime in a thread-local either.
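Concretely:

use std::cell::RefCell;

thread_local!(static COUNTER: RefCell<u32> = RefCell::new(0));

fn main() {
    // The &RefCell<u32> handed to the closure cannot outlive it, so you
    // mutate in place or copy data out; you never keep the reference.
    COUNTER.with(|c| *c.borrow_mut() += 1);
    let snapshot = COUNTER.with(|c| *c.borrow());
    println!("{snapshot}");
}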

To a first approximation, each lifetime annotation represents a stack frame that holds some data that the type references. The entire lifetime machinery is designed to keep track of this and ensure that references to the stack (or data transitively owned by the stack) are destroyed before their referent goes out of scope.

If you are doing something that can't be tracked statically this way, like storing a value in a global or sending it to a different thread, the only lifetime you can use in safe code is 'static, which mostly means that there are no references¹ inside the value. If you can prove that there will never be a dangling pointer in practice, then you can use unsafe to override the compiler's analysis; that's what it's there for.

¹ For this purpose, things like Arc don’t count as references because they use a different mechanism to ensure they’re always valid.
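As a concrete instance of that last rule, this is exactly why thread::spawn demands 'static:

fn main() {
    let local = String::from("stack data");

    // Rejected: `spawn` requires a 'static closure, but this one would
    // borrow `local` from `main`'s stack frame, which the thread may outlive.
    // std::thread::spawn(|| println!("{local}"));

    // Accepted: moving ownership in means no borrows, hence 'static.
    std::thread::spawn(move || println!("{local}")).join().unwrap();
}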


The thing about lifetimes is that they are only ever defined implicitly, based on ownership. A lifetime annotation is a way to tell the compiler how lifetimes relate to one another. Annotations also serve the dual purpose of telling the programmer that something is borrowed, but that's about it. An annotation cannot magically make something live longer (or shorter) than whatever its owner decides.

If you approach lifetimes thinking they let you control something, then no wonder it's confusing! You're not fighting the borrow checker; you're fighting an opportunity to learn.
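A small example of what an annotation actually does:

// The annotation doesn't make anything live longer; it only records that
// the return value borrows from `s`, so callers get checked against that.
fn first_word<'a>(s: &'a str) -> &'a str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let sentence = String::from("hello world");
    let word = first_word(&sentence);
    println!("{word}"); // fine: `sentence` is still alive here
}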


Given your design, which executes code in the JVM from a different thread, I would err on the side of caution and say it is probably going to be full of UB if you use unsafe. Even if you move the initial thread into Java, the best you can do is get some JNIEnv reference for that thread. How can you use that reference from another thread that calls into Lua?

JNIEnv is by definition not thread safe. A cursory search turns up several results affirming this. Here are a few:

  1. Are JNI functions thread-safe?

The JNI interface pointer (JNIEnv *) is only valid in the current thread. You must not pass the interface pointer from one thread to another, or cache an interface pointer and use it in multiple threads. The Java Virtual Machine will pass you the same interface pointer in consecutive invocations of a native method from the same thread, but different threads pass different interface pointers to native methods.

  2. FrankStain/jnipp: Jni++ : Thread-safe and type-safe C++11 wrap for Android JNI. (github.com)

The user of JNI must manually control the lifetime of Java Environment (JNIEnv instances) for each of native threads the user prefer to use JNI. This fact implicitly forces the C++ developers to abandon the native threads and use only Java threads. This is a huge disadvantage of JNI.
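For what it's worth, the jni crate exposes the sanctioned per-thread route directly: JavaVM (unlike JNIEnv) is Send + Sync, so you can carry it across threads and attach each native thread yourself. A sketch, assuming the crate's 0.x API (get_java_vm / attach_current_thread) and reusing the names from the question:

use jni::objects::JClass;
use jni::{JNIEnv, JavaVM};

#[no_mangle]
pub extern "system" fn Java_bluefire_106_mimosa_1computers_lua_DLLHandler_createComputerEnvDLL(env: JNIEnv, _class: JClass)
{
    // JavaVM may cross threads freely; JNIEnv may not.
    let vm: JavaVM = env.get_java_vm().expect("failed to get JavaVM");

    std::thread::spawn(move || {
        // Attach this native thread; the guard yields a JNIEnv valid on
        // *this* thread and detaches it when dropped.
        let env = vm.attach_current_thread().expect("failed to attach thread");
        let _ = env.find_class("bluefire_06/mimosa_computers/lua/DLLHandler");
        // ... drive the Lua VM for this computer here ...
    });
}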
