I'm building an app that keeps purchase records.
I used SwiftData so that storage, backup, and restore could all be handled in one place.
The problem is that reading the data back through SwiftData is very slow: it takes about 2 minutes to read roughly 800 records.
I don't know many ways to deal with this.
On the server side, I understand that paging is used to transfer data efficiently.
Realm doesn't implement paging, but it stays fast because its results are loaded lazily.
What should I do with SwiftData? How can I read a large amount of data efficiently?
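What I had in mind is something like a paged fetch, but I'm not sure this is the right approach for SwiftData. The helper below is only a sketch, assumed to live in the same class that owns my container; fetchLimit and fetchOffset are FetchDescriptor properties, everything else is my own illustration.

// Sketch of the paged fetch I was imagining (hypothetical helper).
@MainActor func fetchStorePage(offset: Int, limit: Int) throws -> [StoreModel] {
    var descriptor = FetchDescriptor<StoreModel>(
        sortBy: [SortDescriptor(\StoreModel.regDate, order: .reverse)]
    )
    descriptor.fetchLimit = limit    // load at most one page of stores
    descriptor.fetchOffset = offset  // skip the stores already shown
    return try container.mainContext.fetch(descriptor)
}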
Below is the model I created.
import Foundation
import SwiftData

@Model
class StoreModel {
    var id: String = UUID().uuidString
    var name: String = "None"
    var buyCount: Int = 0
    var buy: [BuyModel]?        // purchases that belong to this store
    var regDate: Date?

    init(name: String,
         buyCount: Int,
         buy: [BuyModel],
         regDate: Date) {
        self.id = UUID().uuidString
        self.name = name
        self.buyCount = buyCount
        self.buy = buy
        self.regDate = regDate
    }
}

@Model
class BuyModel {
    var id: String = UUID().uuidString
    var type: Int = 0
    var amount: Double = 0
    var eachPrice: Double = 0
    var store: StoreModel?      // inverse side of the store–buy relationship

    init(type: Int,
         amount: Double,
         eachPrice: Double,
         regDate: Date = Date()) {
        self.id = UUID().uuidString
        self.type = type
        self.amount = amount
        self.eachPrice = eachPrice
    }
}
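I also wondered whether I should declare the relationship explicitly instead of relying on the plain array property. I haven't changed the models yet; I assume the only change inside StoreModel would be on the buy property, something like:

// Hypothetical change inside StoreModel (not applied in my current code).
// I don't know whether this affects how eagerly the buy array is loaded.
@Relationship(deleteRule: .cascade, inverse: \BuyModel.store)
var buy: [BuyModel]?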
Below is how I add data.
@MainActor func addStore(name: String,
                         buyCount: Int,
                         buy: [BuyModel],
                         regDate: Date) -> AnyPublisher<Bool, Error> {
    return Future<Bool, Error> { promise in
        let store = StoreModel(name: name,
                               buyCount: buyCount,
                               buy: buy,
                               regDate: regDate)
        do {
            self.container.mainContext.insert(store)
            try self.container.mainContext.save()
            promise(.success(true))
        } catch {
            print("error: \(error.localizedDescription)")
            promise(.failure(error))
        }
    }.eraseToAnyPublisher()
}
@MainActor func addBuy(storeId: String,
                       buy: BuyModel) -> AnyPublisher<Bool, Error> {
    return Future<Bool, Error> { promise in
        do {
            // Fetch every store and look up the matching id in memory.
            let descriptor = FetchDescriptor<StoreModel>()
            let stores = try self.container.mainContext.fetch(descriptor)
            if let store = stores.first(where: { $0.id == storeId }) {
                store.buy?.append(buy)
                try self.container.mainContext.save()
                promise(.success(true))
            } else {
                promise(.failure(NSError(domain: "", code: -1,
                                         userInfo: [NSLocalizedDescriptionKey: "Store not found"])))
            }
        } catch {
            promise(.failure(error))
        }
    }.eraseToAnyPublisher()
}
This is the code I use to fetch data.
I didn't implement separate logic to fetch Buy records, because each Store already contains its Buy array when I fetch it.
@MainActor func getAllStore() -> AnyPublisher<[StoreModel], Error> {
    return Future<[StoreModel], Error> { promise in
        do {
            let sortDescriptor = SortDescriptor(\StoreModel.regDate, order: .reverse)
            let descriptor = FetchDescriptor<StoreModel>(sortBy: [sortDescriptor])
            let stores = try self.container.mainContext.fetch(descriptor)
            promise(.success(stores))
        } catch {
            promise(.failure(error))
        }
    }.eraseToAnyPublisher()
}
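Since each Store carries its Buy array, I also wondered whether I should be prefetching that relationship when I fetch the stores. I haven't tried it yet; I assume it would look something like this (the helper name is mine, and I'm not sure this is the right lever for the slowness):

// Hypothetical variant of the fetch that prefetches the buy relationship.
@MainActor func getAllStorePrefetched() throws -> [StoreModel] {
    var descriptor = FetchDescriptor<StoreModel>(
        sortBy: [SortDescriptor(\StoreModel.regDate, order: .reverse)]
    )
    descriptor.relationshipKeyPathsForPrefetching = [\StoreModel.buy]
    return try container.mainContext.fetch(descriptor)
}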
@MainActor func getStore(storeId: String) -> AnyPublisher<StoreModel, any Error> {
    return Future<StoreModel, Error> { promise in
        do {
            let descriptor = FetchDescriptor<StoreModel>()
            let stores = try self.container.mainContext.fetch(descriptor)
            if let store = stores.first(where: { $0.id == storeId }) {
                promise(.success(store))
            } else {
                promise(.failure(NSError(domain: "", code: -1,
                                         userInfo: [NSLocalizedDescriptionKey: "Store not found"])))
            }
        } catch {
            promise(.failure(error))
        }
    }.eraseToAnyPublisher()
}
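A side question: both addBuy and getStore fetch every store and filter in memory to find a single id. I assume a predicate-based lookup would look something like the sketch below, though I haven't checked whether it matters for my problem (the helper name is mine):

// Hypothetical id lookup using a predicate instead of fetching everything.
@MainActor func fetchStore(byId storeId: String) throws -> StoreModel? {
    var descriptor = FetchDescriptor<StoreModel>(
        predicate: #Predicate<StoreModel> { $0.id == storeId }
    )
    descriptor.fetchLimit = 1   // only one match is needed
    return try container.mainContext.fetch(descriptor).first
}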
To generate test data, I created each model in a for loop and injected it with random counts and values, roughly like the sketch below.
I generated up to 100 Store records and importing them was quick; each store holds about 800 Buy records. Reading this data back is what is very slow.
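The generation code is roughly this (the ranges and names are illustrative, not my exact code; assume it lives in the same class as addStore):

// Rough sketch of how I generate the test data.
var cancellables = Set<AnyCancellable>()   // assumed stored property on the same class

@MainActor func generateTestData() {
    for storeIndex in 0..<100 {
        // About 800 Buy records per store, filled with random values.
        let buys = (0..<800).map { _ in
            BuyModel(type: Int.random(in: 0...3),
                     amount: Double.random(in: 1...10),
                     eachPrice: Double.random(in: 100...10_000))
        }
        addStore(name: "Store \(storeIndex)",
                 buyCount: buys.count,
                 buy: buys,
                 regDate: Date())
            .sink(receiveCompletion: { _ in }, receiveValue: { _ in })
            .store(in: &cancellables)
    }
}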
I display the data in a UITableView.