Run batch file minimized from Windows Task Scheduler

If you are running scheduled Windows batch files, it can sometimes be more convenient to run the batch file minimized. Here is a simple example of how to schedule batch files in Windows Task Scheduler. If you want the batch file to run minimized, you have to make an adjustment in the Actions tab:

Add arguments (optional): /c start /min "My title" C:\Aplikacijas\NameOfMyFile.bat ^& exit

While in theory you are correct that it adds an extra step to the process, the speeds StableBit is able to obtain are much lower than what Google's official File Stream client can achieve. How about a thought experiment? If you were to release software for your own cloud platform, would you optimize it for API users or for your own product, where you can see which files and access patterns your users have? After all, Google is an advertising company, and the more information, the better. Additionally, the 150-200 Mbps that CloudDrive gets on a good day is much, much less than the 1 Gbps+ that File Stream can do. That means that regardless of the number of hops, File Stream is much faster. Transferring a 5 GB file at 120 MB per second takes about 0.66 minutes, so downloading and re-uploading it is about 1.3 minutes total time - almost half the time. At 200 Mbps (giving CloudDrive the best chance), the same 5 GB file takes 03:34.

What about a thought experiment? If CloudDrive allowed you to put the cache on a File Stream drive or any other cloud-based file system (and there are good reasons why it does not), what would happen with one chunk of data from the CloudDrive that you want to read? That chunk needs to be downloaded from the CloudDrive and put into the CloudDrive cache, which we now assume to be located on the FS drive. Therefore, it would first get downloaded and stored in the local FS cache, from where it would then get uploaded to the FS cloud (Google Drive). Finally, that is two network flows (cloud -> local -> cloud) instead of one.
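The transfer-time comparison above can be sketched as a back-of-the-envelope calculation (a rough estimate assuming decimal units, i.e. 5 GB = 5000 MB and 200 Mbps = 25 MB/s; the poster's exact figures were presumably rounded slightly differently):

```python
# Rough transfer-time estimates for a 5 GB file, decimal units assumed.
FILE_MB = 5000

# Manual copy through File Stream at ~120 MB/s, done twice
# (download to local disk, then re-upload): two one-way flows.
one_way_min = FILE_MB / 120 / 60        # ~0.7 minutes each way
round_trip_min = 2 * one_way_min        # ~1.4 minutes total

# Single CloudDrive flow at 200 Mbps (its best case in this thread).
clouddrive_min = FILE_MB / (200 / 8) / 60  # ~3.3 minutes

print(f"one way:    {one_way_min:.2f} min")
print(f"round trip: {round_trip_min:.2f} min")
print(f"clouddrive: {clouddrive_min:.2f} min")
```

Even counting both flows of the manual round trip, it comes out well under the single CloudDrive transfer, which is the poster's point.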
Where do you think File Stream data gets stored before you can access it? In a cache folder on your local drive!
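The double-flow argument above can be captured in a toy model (purely illustrative; the function name and the one-flow/two-flow accounting are my own simplification of the post, not anything from CloudDrive itself):

```python
def network_flows_per_chunk_read(cache_on_cloud_fs: bool) -> int:
    """Count the network transfers needed to read one CloudDrive chunk."""
    flows = 1  # the chunk is always downloaded once from the provider
    if cache_on_cloud_fs:
        # If the cache folder lives on a cloud-backed mount (e.g. a
        # File Stream drive), the FS client re-uploads whatever is
        # written there, adding a second flow: cloud -> local -> cloud.
        flows += 1
    return flows

print(network_flows_per_chunk_read(False))  # cache on a local disk: 1
print(network_flows_per_chunk_read(True))   # cache on a File Stream mount: 2
```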
My only alternative to this setup is to manually encrypt my files, upload them, and re-download them, which is vastly quicker at 1 Gbps than at 50 Mbps. Since Google File Stream is mounted as FAT32, using 20 MB chunks would be optimal to avoid hitting the 4 GB file-size limit of that file format.
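As a quick sanity check on the chunk-size reasoning (the constant is the standard FAT32 per-file limit, not anything CloudDrive-specific):

```python
FAT32_MAX_FILE = 4 * 1024**3 - 1  # FAT32 caps a single file at 4 GiB - 1 byte
CHUNK_SIZE = 20 * 1024**2         # one 20 MB CloudDrive chunk per file

# Each chunk is written as its own file, so it sits far below
# the FAT32 per-file limit and can never trip it.
assert CHUNK_SIZE < FAT32_MAX_FILE
```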
I would like to remove this limitation so I can test this setup, but I can't, because CloudDrive will not let me save to non-local sources that are not physically connected internally. The issue I am trying to solve: I would like to combine Google File Stream (I get speeds of 950 Mbps from File Stream - loading is basically instant) with CloudDrive's local cache stored on the File Stream-mounted file system. The product isn't inherently bad, but with the Google Drive speed limitations while using CloudDrive, it has become unusable. My maximum speeds are terrible too, with a whopping average of 50 Mbps download before hitting throttling. I normally upload no more than 1 GB a day and the software sits idle, so I know it's not hitting Google's API limit. With a 1 Gbps line up and down, I can't seem to keep CloudDrive happy, and disconnects are daily at this point. As the owner of 3 CloudDrive licenses, I'm pretty disappointed with the current performance of the product with Google Drive.