Hi,
I have a question about how to model a sequence of optional values, where the `n`-th value depends on all previous `n - 1` values.
For context: in a GUI application there is a data type

```rust
struct Data {
    subdata_1: Option<i32>,
    subdata_2: Option<i32>,
    subdata_3: Option<i32>,
}
```
Initially, all `subdata_*` fields are `None`. As the user interacts with the program and inputs data, `subdata_1` becomes `Some(...)`; after some more interactions, `subdata_2` becomes `Some(...)` (its value depends on `subdata_1.unwrap()`), and so on.
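In code, the flow is roughly like this (a sketch; how `subdata_2` is derived from `subdata_1` is made up for illustration):

```rust
let mut data = Data { subdata_1: None, subdata_2: None, subdata_3: None };

// First interaction: the user supplies the first value.
data.subdata_1 = Some(42);

// Later interaction: the second value is computed from the first.
data.subdata_2 = Some(data.subdata_1.unwrap() + 1);
```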
Thus, for `n < m`, when `subdata_m` is `Some(...)`, all previous `subdata_n` must be `Some(...)` as well. Consequently, the following is not a valid `Data` instance:
```rust
Data {
    subdata_1: None,
    subdata_2: Some(...),
    subdata_3: None,
}
```
I'd really like to model the invariant above in the data type itself, such that invalid instances are unrepresentable. Something like

```rust
struct Data(Option<(i32, Option<(i32, Option<i32>)>)>);
```

which is obviously unreadable and hard to maintain.
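Spelled out with named structs instead of tuples, the same nesting would look like this (a sketch; the type names are invented):

```rust
// Hypothetical names; structurally identical to the tuple version above.
struct Data(Option<Step1>);

struct Step1 {
    subdata_1: i32,
    rest: Option<Step2>,
}

struct Step2 {
    subdata_2: i32,
    rest: Option<i32>, // subdata_3
}
```

This is more readable, but adding, removing, or reordering a step still means touching the whole chain.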
Note that the actual types of the `subdata_*` fields differ, i.e., storing them in a vector is not an `Option` (pun intended).
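For instance, the real data looks more like this (the concrete field types here are placeholders):

```rust
// Placeholder field types; the actual application uses its own domain types.
struct Data {
    subdata_1: Option<String>,
    subdata_2: Option<f64>,
    subdata_3: Option<Vec<u8>>,
}
```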
Do you have any ideas on how to do this?