I have a scenario where I will be receiving messages through a socket connection. I also need to keep a list of 100 messages (such as a List&lt;Message&gt;) and periodically toss out old messages from the list when a condition is met.
List&lt;Message&gt; MessageList = new List&lt;Message&gt;();
Message CurrentMessage;

public void ReceiveSocketMessage(SocketMessage message) {
    if (message.Type == MessageType.Duplicate) {
        // Refresh the timestamp on the existing entry rather than adding another copy
        CurrentMessage = MessageList.FirstOrDefault(x => x.MessageText == message.Text);
        if (CurrentMessage != null) {
            CurrentMessage.Time = DateTime.Now;
        }
    } else {
        MessageList.Add(new Message { MessageText = message.Text, Time = DateTime.Now });
    }
}
Since the socket delivers messages very quickly, it seems like any other method that tries to access MessageList at the same time will cause a problem, and other methods will need to read the list at many unpredictable points in time.
I just can't wrap my head around working with this list of objects from multiple places without causing trouble. What is the traditional way of handling this?
UPDATE: Would it be simpler just to maintain the items in a database table instead of a list of C# objects? A database would surely have all the major locking problems ironed out to the point where I shouldn't need to worry about them.
Your guiding factors here should be how much traffic you expect to see through your queue, how important the data is, and the ratio of reads to writes.
There are now some pretty handy built-in collections in the .NET framework that handle the concurrency for you and can save a lot of time and effort, because once you start locking and unlocking resources yourself you will almost certainly bump into some obscure blocking situation sooner or later. A ConcurrentQueue or similar should be able to help with the problem you have outlined. Queues expect a push-pop style of workflow, which may be what you want; if not, it is worth looking at the other thread-safe collections in System.Collections.Concurrent.
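For instance, a minimal sketch of the 100-message buffer on top of ConcurrentQueue might look like the following. The Message shape and the 100-item cap are taken from the question; the class and member names are purely illustrative.

using System;
using System.Collections.Concurrent;

public class Message {
    public string MessageText { get; set; }
    public DateTime Time { get; set; }
}

public class MessageBuffer {
    private readonly ConcurrentQueue&lt;Message&gt; _queue = new ConcurrentQueue&lt;Message&gt;();
    private const int MaxMessages = 100; // cap taken from the question

    // Safe to call from the socket receive thread
    public void Add(Message message) {
        _queue.Enqueue(message);

        // Toss out the oldest messages once the cap is exceeded
        Message discarded;
        while (_queue.Count > MaxMessages && _queue.TryDequeue(out discarded)) { }
    }

    // Safe to call from any other thread at any time; returns a point-in-time copy
    public Message[] Snapshot() {
        return _queue.ToArray();
    }
}

Note that a queue only supports adding at one end and removing at the other, so the "find a duplicate and update its timestamp" part of your scenario would still need a different structure or a lock, which is why the other thread-safe collections may be a better fit.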
If these are incoming messages that you essentially need to put into a "received" queue and then process, then you might look at using a message queue product rather than simply dropping the data into a database. That gives you the benefit of having your queue data stored durably, so that if the system goes down it is recoverable (which may or may not be important to you), and it also means that if you need to distribute your workload in future you can simply plug more clients into the queue and have them process it as necessary. There are a number of message queue tools out there that may be able to help you, but whether you need one depends heavily on your context.
I would encapsulate the standard list in a custom list class that is made thread-safe. How it achieves thread-safety is then nicely encapsulated, and the implementation can be tweaked for the best performance given its typical usage.
First, try whether a simple lock on every access to the list gives acceptable performance. If so, you're done!
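As a rough sketch of that first step, assuming the Message shape from the question (the class and member names here are illustrative, not something from your code):

using System;
using System.Collections.Generic;
using System.Linq;

public class SynchronizedMessageList {
    // Assumes the Message class from the question (MessageText, Time)
    private readonly List&lt;Message&gt; _messages = new List&lt;Message&gt;();
    private readonly object _sync = new object();

    public void Add(Message message) {
        lock (_sync) {
            _messages.Add(message);
        }
    }

    public Message FindByText(string text) {
        lock (_sync) {
            return _messages.FirstOrDefault(x => x.MessageText == text);
        }
    }

    // Periodic cleanup: toss out messages older than the cutoff
    public void RemoveOld(DateTime cutoff) {
        lock (_sync) {
            _messages.RemoveAll(x => x.Time < cutoff);
        }
    }
}

Because every caller goes through these methods, the list itself is never exposed, and the locking strategy can be changed later without touching the callers.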
If this is too slow, profile how the list is used. Are there a lot of reads from multiple threads and only a few writes? You could try a ReaderWriterLock, which allows the reads to proceed in parallel, and perhaps buffer the writes into a bulk update to minimize the time spent holding the write lock.
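A sketch of how that read/write split might look, using ReaderWriterLockSlim (the newer, lighter-weight replacement for ReaderWriterLock), again with illustrative names and reusing the Message shape from the question:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

public class ReadMostlyMessageList {
    private readonly List&lt;Message&gt; _messages = new List&lt;Message&gt;();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Many readers can hold the read lock at the same time
    public Message FindByText(string text) {
        _lock.EnterReadLock();
        try {
            return _messages.FirstOrDefault(x => x.MessageText == text);
        } finally {
            _lock.ExitReadLock();
        }
    }

    // Writers get exclusive access; adding a buffered batch in one go
    // keeps the write lock held for as short a time as possible
    public void AddRange(IEnumerable&lt;Message&gt; batch) {
        _lock.EnterWriteLock();
        try {
            _messages.AddRange(batch);
        } finally {
            _lock.ExitWriteLock();
        }
    }
}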