
I want to host two websites (ASP.NET MVC). They each have a folder with the same name, and I want to copy data from one website to the other periodically, for example from website1/file/ to website2/file/.

That's why I thought of creating a Windows service to do it. My question is: how can I copy data between these two folders over HTTP?
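A minimal sketch of the kind of periodic copy loop described above, assuming website2 exposes a hypothetical /file/upload endpoint; the folder path and URL are placeholders only:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class FileSyncWorker
{
    // Both values are assumptions for illustration; adjust to your hosting setup.
    private const string SourceFolder = @"D:\sites\website1\file";
    private const string TargetUploadUrl = "https://website2.example.com/file/upload";

    private static readonly HttpClient Http = new HttpClient();

    static void Main()
    {
        RunAsync().GetAwaiter().GetResult();
    }

    static async Task RunAsync()
    {
        while (true)
        {
            foreach (var path in Directory.GetFiles(SourceFolder))
            {
                await UploadAsync(path);
            }

            // Run a copy pass once per hour; a real Windows service would
            // use a timer plus proper OnStart/OnStop handling instead.
            await Task.Delay(TimeSpan.FromHours(1));
        }
    }

    static async Task UploadAsync(string path)
    {
        using (var stream = File.OpenRead(path))
        using (var content = new MultipartFormDataContent())
        {
            // "file" must match the parameter name the receiving endpoint expects.
            content.Add(new StreamContent(stream), "file", Path.GetFileName(path));

            var response = await Http.PostAsync(TargetUploadUrl, content);
            response.EnsureSuccessStatusCode();
        }
    }
}
```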

  • Why use http to copy files between two folders on the same server? You can use a program such as rsync and run it with a cron job. – Håken Lid Nov 27 '16 at 20:53
  • Do you have a unique file naming scheme in place? Otherwise you could overwrite files. – Adam Carr Nov 27 '16 at 21:25
  • How much access do you have to the host operating systems as well? What are they, Windows 2012 R2? It's hard to answer exactly without more information. – Adam Carr Nov 27 '16 at 21:26
  • @Håken Lid The hosting support asked me to do it with HTTP. Can you give me more information about your suggestion? Thanks in advance. – Amine Nadori Dec 01 '16 at 19:51
  • @Adam Carr Yes, I already have a naming system for files. What I am looking for is how to copy the data (videos, audio, PDF ..) between the two websites. – Amine Nadori Dec 01 '16 at 19:53
  • To do so will not be trivial. Can you post up what you have tried? – Adam Carr Dec 01 '16 at 20:26
  • Also what .net version do you have installed on the servers? – Adam Carr Dec 01 '16 at 20:27
  • @Adam Carr I have hosted these two websites on smarterasp.net; the framework is … I have already asked the support if they have a solution, but the answer was to develop it manually and do it with HTTP. – Amine Nadori Dec 04 '16 at 14:43
  • Are you allowing updates or overwriting of files? E.g. a user uploads a video file and then needs to re-upload it, replacing the old copy. – Adam Carr Dec 06 '16 at 22:21

1 Answer


Personally, given the complexity of developing a custom solution, I would look at using an existing service such as Dropbox.

Another alternative would be to store the files in a distributed file store such as Amazon S3 or Azure Blob Storage. This eliminates the need for synchronization in the first place, and the store can be fronted by a proxy web service that streams the files to the end user.
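As a rough sketch of that approach, assuming the Azure.Storage.Blobs package and placeholder connection-string/container names, a proxy action that streams a shared blob to the user might look like this:

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;
using Azure.Storage.Blobs;

public class SharedFileController : Controller
{
    // Connection string and container name are placeholders.
    private static readonly BlobContainerClient Container =
        new BlobContainerClient("<storage-connection-string>", "shared-files");

    // GET /sharedfile/download?name=video1.mp4
    public async Task<ActionResult> Download(string name)
    {
        var blob = Container.GetBlobClient(name);

        var exists = await blob.ExistsAsync();
        if (!exists.Value)
            return HttpNotFound();

        // Stream the blob through the site rather than copying it to local disk.
        var stream = await blob.OpenReadAsync();
        return File(stream, "application/octet-stream", name);
    }
}
```

Both websites read and write the same container, so there is nothing to keep in sync between them.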

The reason I suggest this is that there is a lot of complexity involved in managing file synchronization over HTTP.

I don't think you will get a full solution on Stack Overflow, but I can make some recommendations.

  1. I would use a master-slave system to coordinate the synchronization. This requires some design work and adds complexity, but it gives you the ability to add more nodes in the future. Implementing a master-slave system can't be easily detailed in a single post and would require further research on your part. There is a good resource on this here already: How to elect a master node among the nodes running in a cluster?
  2. Calculate deltas for each node, e.g. which files do I have that the master does not? Which files does the master have that I do not? Are there naming conflicts with other nodes? How do I determine which is the most up-to-date file? (A rough sketch of a delta calculation follows this list.)
  3. Transferring the files will require some sort of endpoint to connect to, either as part of the service or as part of your existing website.
  4. An HTTP client to send the files and track the progress/state of each transfer for error handling.
  5. Error handling overall: what happens if a file is only partially transferred to the master, and how do you clean up failed files? (A sketch of a receiving endpoint covering points 3-5 also follows the list.)
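To give a feel for point 2, here is a minimal sketch of a delta calculation, assuming each node can produce a simple name-plus-hash listing of its folder. The FileEntry type and folder layout are illustrative only, not part of any existing library:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

public class FileEntry
{
    public string Name { get; set; }
    public string Hash { get; set; }
}

public static class DeltaCalculator
{
    // Build a name + MD5 listing of one node's folder.
    public static IEnumerable<FileEntry> ListLocalFiles(string folder)
    {
        foreach (var path in Directory.GetFiles(folder))
        {
            using (var md5 = MD5.Create())
            using (var stream = File.OpenRead(path))
            {
                yield return new FileEntry
                {
                    Name = Path.GetFileName(path),
                    Hash = Convert.ToBase64String(md5.ComputeHash(stream))
                };
            }
        }
    }

    // Files present locally but missing (or different) on the master.
    public static List<FileEntry> FilesToPush(
        IEnumerable<FileEntry> local, IEnumerable<FileEntry> master)
    {
        var known = new HashSet<string>(master.Select(f => f.Name + "|" + f.Hash));
        return local.Where(f => !known.Contains(f.Name + "|" + f.Hash)).ToList();
    }
}
```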
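And for points 3-5, a rough sketch of what a receiving endpoint on the master/website2 side could look like in ASP.NET MVC. The controller name, the ~/file folder and the .partial suffix are assumptions for illustration; the important idea is to write to a temporary file first and only move it into place once the transfer has completed, so a dropped connection never leaves a half-written file behind:

```csharp
using System.IO;
using System.Web;
using System.Web.Mvc;

public class FileTransferController : Controller
{
    // The ~/file folder is the shared folder from the question; adjust as needed.
    private string TargetFolder { get { return Server.MapPath("~/file"); } }

    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        if (file == null || file.ContentLength == 0)
            return new HttpStatusCodeResult(400, "No file received");

        var finalPath = Path.Combine(TargetFolder, Path.GetFileName(file.FileName));
        var tempPath = finalPath + ".partial";

        try
        {
            // Write to a temporary file, then atomically move it into place.
            file.SaveAs(tempPath);
            if (System.IO.File.Exists(finalPath))
                System.IO.File.Delete(finalPath);   // allow updates/overwrites
            System.IO.File.Move(tempPath, finalPath);
            return new HttpStatusCodeResult(200);
        }
        catch
        {
            // Clean up a failed/partial transfer so retries start clean.
            if (System.IO.File.Exists(tempPath))
                System.IO.File.Delete(tempPath);
            throw;
        }
    }
}
```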

That is probably only the tip of the iceberg in terms of the complexity of trying to do this, hence my recommendation to use an existing product or cloud service.

– Adam Carr