I’m new to Rust. Does anyone know a relatively painless way to map a large number of complex structs to SQLite tables in Rust?
I asked in the Rust Discord and was recommended SeaORM. However, I really don’t like that I would have to rename all my structs to Model, and because of that fixed name I would need to refactor my 30-plus structs into separate files, not to mention write the implementations for converting the nested values into SeaORM values.
I’m looking for a way to use my structs inside my program as they are (or without a huge library-specific refactor) while also using them as the templates for my SQLite database’s tables and migrations, if that makes sense.
To clarify, I want to use my structs as the tables, so sqlx’s query_as!() should work directly with my structs, i.e. mapped 1-to-1.
As I said, I’m aware this might not be possible (and may be a dumb question). If there’s nothing that lets me use my structs more or less as-is, say with macros (or something of the sort; I know it can’t be that simple), I would rather just do it by hand. So a “No, that is not possible” will work for me; I just wanted to ask on Stack Overflow one last time to make sure there really is no simple way to solve my problem.
This is just one of many files full of structs that are nested like TermDictionaryEntry … I even cut this file down so it wasn’t so large.
use crate::dictionary_data::TermGlossaryContent;
use std::collections::HashMap;
// NumOrStr, Tag, DictionaryEntryType, etc. are defined elsewhere in the crate;
// I trimmed them out to keep this listing short.
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct KanjiStat {
name: String,
category: String,
content: String,
order: u16,
score: u64,
dictionary: String,
value: NumOrStr,
}
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct DictionaryOrder {
index: u16,
priority: u16,
}
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct KanjiFrequency {
index: u64,
dictionary: String,
dictionary_index: u16,
dictionary_priority: u16,
character: String,
frequency: NumOrStr,
display_value: Option<String>,
display_value_parsed: bool,
}
pub type KanjiStatGroups = HashMap<String, Vec<KanjiStat>>;
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct KanjiDictionaryEntry {
entry_type: DictionaryEntryType,
character: String,
dictionary: String,
onyomi: Vec<String>,
kunyomi: Vec<String>,
tags: Vec<Tag>,
stats: KanjiStatGroups,
definitions: Vec<String>,
frequencies: Vec<KanjiFrequency>,
}
/// Frequency information corresponds to how frequently a term appears in a corpus,
/// which can be a number of occurrences or an overall rank.
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct TermFrequency {
index: u32,
headword_index: u32,
dictionary: String,
dictionary_index: u16,
dictionary_priority: u16,
has_reading: bool,
frequency: u32,
display_value: Option<String>,
display_value_parsed: bool,
}
/// A dictionary entry for a term or group of terms.
#[allow(dead_code)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct TermDictionaryEntry {
/// The type of the entry.
entry_type: TermSourceMatchSource,
/// Whether or not any of the sources is a primary source. Primary sources are derived from the
/// original search text, while non-primary sources originate from related terms.
is_primary: bool,
/// Ways that a looked-up word might be an inflected form of this term.
inflection_rule_chain_candidates: Vec<InflectionRuleChainCandidate>,
/// A score for the dictionary entry.
score: i32,
/// The sorting value based on the determined term frequency.
frequency_order: u32,
/// The index of the dictionary in the original list of dictionaries used for the lookup.
dictionary_index: u32,
/// The priority of the dictionary.
dictionary_priority: u32,
/// The number of primary sources that had an exact text match for the term.
source_term_exact_match_count: u32,
/// The maximum length of the original text for all primary sources.
max_original_text_length: u32,
/// Headwords for the entry.
headwords: Vec<TermHeadword>,
/// Definitions for the entry.
definitions: Vec<TermDefinition>,
/// Pronunciations for the entry.
pronunciations: Vec<TermPronunciation>,
/// Frequencies for the entry.
frequencies: Vec<TermFrequency>,
}
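To make the “mapped 1-to-1” part concrete, here is roughly what I mean for one flat struct. The row helpers and column order below are entirely made up for illustration (they stand in for whatever a driver like sqlx or rusqlite hands back), not any real API:

```rust
// Hand-rolled 1-to-1 mapping sketch for a flat struct.
// Column order mirrors field order; the Vec<String> row is a stand-in
// for a real driver's row type.

#[derive(Clone, Debug, PartialEq)]
struct TermFrequencyRow {
    headword_index: u32,
    dictionary: String,
    frequency: u32,
}

impl TermFrequencyRow {
    // What a FromRow-style derive would generate, written by hand.
    fn from_columns(cols: &[String]) -> Option<Self> {
        Some(Self {
            headword_index: cols.get(0)?.parse().ok()?,
            dictionary: cols.get(1)?.clone(),
            frequency: cols.get(2)?.parse().ok()?,
        })
    }

    fn to_columns(&self) -> Vec<String> {
        vec![
            self.headword_index.to_string(),
            self.dictionary.clone(),
            self.frequency.to_string(),
        ]
    }
}

fn main() {
    let row = TermFrequencyRow {
        headword_index: 3,
        dictionary: "jmdict".into(),
        frequency: 120,
    };
    // Round-trip: struct -> columns -> struct.
    let back = TermFrequencyRow::from_columns(&row.to_columns()).unwrap();
    assert_eq!(row, back);
}
```

The flat structs (TermFrequency, DictionaryOrder, …) would map like this cleanly; it’s the Vec and HashMap fields on things like TermDictionaryEntry that would presumably need child tables instead.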
After asking a bunch of people on Discord and looking it up online, I’m fairly sure the least painful but most time-consuming way would be to just create the tables myself using sqlx.
However, migrating a large number of structs like this would be quite a pain if I ever need to change them heavily. That is why I am looking for a relatively simple way to have the SQLite tables follow my structs as templates, so I can simply drop a table, update the struct’s fields, migrate, and the SQLite table(s) should match up again.
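As a fallback, I’ve been toying with the macro idea myself. Something roughly like this is what I imagined: one macro defines the struct and can also emit its CREATE TABLE statement, so the two can’t drift apart. To be clear, the table_struct! macro and the field => "SQL TYPE" syntax are entirely my own invention for illustration, not a real crate:

```rust
// Sketch: a declarative macro that defines a struct AND derives a
// CREATE TABLE statement from the same field list, so dropping and
// recreating the table after a struct change stays in sync.
// The macro name and the `=> "SQL TYPE"` mapping are made up.

macro_rules! table_struct {
    ($name:ident { $($field:ident : $ty:ty => $sql_ty:literal),* $(,)? }) => {
        #[allow(dead_code)]
        #[derive(Clone, Debug, PartialEq, Eq)]
        pub struct $name {
            $(pub $field: $ty,)*
        }

        impl $name {
            // Emits "CREATE TABLE IF NOT EXISTS Name (field TYPE, ...);"
            pub fn create_table_sql() -> String {
                let cols: Vec<String> = vec![
                    $(format!("{} {}", stringify!($field), $sql_ty)),*
                ];
                format!(
                    "CREATE TABLE IF NOT EXISTS {} ({});",
                    stringify!($name),
                    cols.join(", ")
                )
            }
        }
    };
}

table_struct!(DictionaryOrder {
    priority: u16 => "INTEGER NOT NULL",
    dictionary: String => "TEXT NOT NULL",
});

fn main() {
    println!("{}", DictionaryOrder::create_table_sql());
    // CREATE TABLE IF NOT EXISTS DictionaryOrder (priority INTEGER NOT NULL, dictionary TEXT NOT NULL);
}
```

This only handles flat structs, though; the nested Vec fields would still need their own tables, which is exactly the part I don’t know how to do without doing it all by hand.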