I have nine 2 GB files containing approximately 12M string records, each of which I want to insert as a document into a local MongoDB instance (on Windows).
Right now I'm reading the file line by line and inserting every second line (the first of each pair is an unneeded header), like this:
bool readingFlag = false;
foreach (var line in File.ReadLines(file))
{
    if (readingFlag)
    {
        // Wrap the raw line in a one-field JSON document and parse it.
        var json = "{ 'read': '" + line + "' }";
        var document = MongoDB.Bson.Serialization.BsonSerializer
            .Deserialize<BsonDocument>(json);
        await collection.InsertOneAsync(document);
        readingFlag = false;
    }
    else
    {
        // Every record is preceded by a header line we don't need.
        readingFlag = true;
    }
}
This method works, but not as fast as I expected. I'm now in the middle of the first file, and I estimate it will take about 4 hours to finish just that one file (so roughly 40 hours for all my data).
I think my bottleneck is the file reading, but since the files are so big, I can't load one fully into memory (I get an out-of-memory exception).
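For reference, the whole-file attempt that fails presumably looked something like this minimal sketch (a hypothetical reconstruction, assuming File.ReadAllLines was used; the exact failing code isn't shown above):

using System.IO;

// Hypothetical reconstruction of the in-memory attempt:
// File.ReadAllLines materializes every line of the file at once,
// which throws OutOfMemoryException on a 2 GB file, whereas
// File.ReadLines (used above) streams lines lazily.
string[] allLines = File.ReadAllLines(file);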
Is there another approach that I'm missing here?