Search Results

Search found 14693 results on 588 pages for 'azure storage tables'.


  • Combine auto-syncing cloud and VCS

    - by ComFreek
    This question brought me to another question: is there any VCS, or tool for a VCS, that automatically backs up your source code between the last checkout and your current changes? I lost uncommitted source code changes just one week ago. I did not want to commit yet because the changes were incomplete, but then an error while moving the data to a USB stick caused the data loss. That's the opposite of what a cloud service (like Google Drive, SkyDrive, Dropbox, ...) does: it tracks each change you make! Have you lost your data? No problem, because you have the latest version online. So what would a combined solution look like? It would offer the full functionality of a VCS, including auto-syncing of any intermediate changes between two commits/checkouts to a temporary online location.
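
    One way to approximate this with plain git is to watch the working tree and snapshot it after a quiet period. A minimal C# sketch, assuming git is installed and on PATH; the repository path and the 30-second debounce below are illustrative, and a real tool would write these snapshots to a separate ref or remote rather than into the normal history:

        // Sketch: auto-commit working-tree changes after a quiet period so
        // uncommitted work survives a crash. Assumes git is on PATH; the
        // repo path and debounce interval are illustrative assumptions.
        using System;
        using System.Diagnostics;
        using System.IO;
        using System.Threading;

        class AutoSync
        {
            static Timer debounce;

            static void Main()
            {
                var repo = @"C:\work\my-project"; // hypothetical repository path
                using var watcher = new FileSystemWatcher(repo) { IncludeSubdirectories = true };
                watcher.Changed += (s, e) => Schedule(repo);
                watcher.Created += (s, e) => Schedule(repo);
                watcher.EnableRaisingEvents = true;
                Console.ReadLine(); // keep watching until Enter is pressed
            }

            // Snapshot 30s after the last change so we don't commit per keystroke.
            // Events from .git itself may trigger a no-op run; "git commit" with
            // nothing staged simply fails, which is harmless for a sketch.
            static void Schedule(string repo)
            {
                debounce?.Dispose();
                debounce = new Timer(_ =>
                {
                    Run(repo, "add -A");
                    Run(repo, "commit -m auto-sync-snapshot");
                }, null, TimeSpan.FromSeconds(30), Timeout.InfiniteTimeSpan);
            }

            static void Run(string repo, string args) =>
                Process.Start(new ProcessStartInfo("git", args)
                {
                    WorkingDirectory = repo,
                    UseShellExecute = false
                })?.WaitForExit();
        }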

    Read the article

  • Is there any way to optimize my blob search program?

    - by Vicky
    I wrote this code to search blob items (text files) based on their content. For example: if I search for "Good", the names of the files that contain the word "Good" or "good" should appear in the search results. My code works, but I want to optimize it.

        class BlobSearch
        {
            public static int num = 1;

            static void Main(string[] args)
            {
                string accountName = "accountName";
                string accessKey = "accesskey";
                string azureConString = "DefaultEndpointsProtocol=https;AccountName=" + accountName + ";AccountKey=" + accessKey;
                string blob = "MyBlobContainer";
                string searchText = string.Empty;
                Console.WriteLine("Type and enter to search : ");
                searchText = Console.ReadLine();
                CloudStorageAccount account = CloudStorageAccount.Parse(azureConString);
                CloudBlobClient blobClient = account.CreateCloudBlobClient();
                CloudBlobContainer blobContainer = blobClient.GetContainerReference(blob);
                blobContainer.FetchAttributes();
                var blobItemList = blobContainer.ListBlobs();
                GetBlobList(searchText, blobContainer, blobItemList);
                Console.ReadLine();
            }

            private static async void GetBlobList(string searchText, CloudBlobContainer blobContainer, IEnumerable<IListBlobItem> blobItemList)
            {
                foreach (var item in blobItemList)
                {
                    CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(item.Uri.ToString());
                    if (blockBlob.Name.Contains(".txt"))
                    {
                        await Search(searchText, blockBlob);
                    }
                }
            }

            private async static Task Search(string searchText, CloudBlockBlob blockBlob)
            {
                string text = await blockBlob.DownloadTextAsync();
                if (text.ToLower().IndexOf(searchText.ToLower()) != -1)
                {
                    Console.WriteLine("Result : " + num + " => " + blockBlob.Name.Substring(blockBlob.Name.LastIndexOf('/') + 1));
                    num++;
                }
            }
        }

    I think blobContainer.ListBlobs() is blocking code, because the search will not work until all the blob items are loaded. Is there any way to optimize it, or anywhere else in my code? Thanks
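
    A hedged sketch of one possible optimization, using only calls that already appear in the question's code (ListBlobs, DownloadTextAsync, plus System.Linq and the existing usings): start every download at once and await them together, instead of one blob at a time:

        // Sketch: search all .txt blobs concurrently rather than sequentially.
        // Uses the same SDK calls as the question; the degree of parallelism
        // is left to the task scheduler here for brevity.
        private static async Task<int> SearchAllAsync(
            string searchText, CloudBlobContainer container)
        {
            var tasks = container.ListBlobs()
                .OfType<CloudBlockBlob>()                 // skip directory entries
                .Where(b => b.Name.EndsWith(".txt"))
                .Select(async blob =>
                {
                    string text = await blob.DownloadTextAsync();
                    return text.IndexOf(searchText, StringComparison.OrdinalIgnoreCase) >= 0
                        ? blob.Name
                        : null;
                })
                .ToList();

            int hits = 0;
            foreach (var name in await Task.WhenAll(tasks))
                if (name != null)
                    Console.WriteLine("Result : " + ++hits + " => "
                        + name.Substring(name.LastIndexOf('/') + 1));
            return hits;
        }

    For a very large container it would be wise to bound the parallelism (for example with a SemaphoreSlim); this sketch leaves that out for brevity.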

    Read the article

  • Where will the image be saved? [closed]

    - by Dummy Derp
        import java.awt.Dimension;
        import java.awt.Rectangle;
        import java.awt.Robot;
        import java.awt.Toolkit;
        import java.awt.image.BufferedImage;
        import javax.imageio.ImageIO;
        import java.io.File;
        ...
        public void captureScreen(String fileName) throws Exception {
            Dimension screenSize = Toolkit.getDefaultToolkit().getScreenSize();
            Rectangle screenRectangle = new Rectangle(screenSize);
            Robot robot = new Robot();
            BufferedImage image = robot.createScreenCapture(screenRectangle);
            ImageIO.write(image, "png", new File(fileName));
        }
        ...

    Going over this code I found on the internet, I understood everything except the part where the file is created. What format should the file name be in? Should it be "C:/myFolder/myImage.png" or just "myImage.png", and where will it be saved? Here is what the docs say for the File constructor:

        public File(String pathname)
            Creates a new File instance by converting the given pathname string
            into an abstract pathname. If the given string is the empty string,
            then the result is the empty abstract pathname.
            Parameters: pathname - A pathname string
            Throws: NullPointerException - If the pathname argument is null

    Read the article

  • Microsoft releases version 1.4 of the Windows Azure SDK, fixing numerous bugs from the previous version

    Microsoft releases version 1.4 of the Windows Azure SDK, fixing numerous bugs from the previous version. The Windows Azure development team has just made the new version of its SDK available: Windows Azure SDK 1.4 can now be downloaded and brings a few updates. In particular, it fixes several problems found in the previous version (1.3), among them: - a fix for the IIS failure when the web.config file is set to read-only - a fix for the errors in IIS packages that caused them to double in size when packaged - a fix for the recycling of an entire IIS web role when the diagnostic sto...

    Read the article

  • What is a better way to retrieve data in an application that handles a limited amount of data?

    - by Milanix
    Just moved this question from Stack Overflow, since adding my code snippets would make it really long. Instead, I am interested in better ways to retrieve data in an application that handles a limited amount of data which isn't updated regularly. Let's take this example: I am writing an application which gets a schedule as XML from a server. I have written logic to parse the XML version and update the database only if that version is newer than the local version. Although the update check runs automatically/manually on a daily basis depending on user preference, the actual version update happens only once every few months or so, since it is done by another authority which doesn't provide an API but rather announces its changes publicly. The actual XML contains "(n number of groups)(days in a week)(n number of schedules)". The number of groups is usually 6 and the number of schedules is usually 2, so there would usually be only around 100 strings. I have used SQLite so far, and I want to know how to apply the update to the database. Should I show a progress dialog saying the application is updating, and exit the app when it's done? Since my updates are infrequent I don't think this will really harm the user experience, but is there a better way to do it? I don't want the update to be made while the user is searching, which also uses the database; this causes a "database already open" exception, which I have run into before. Is it better to parse the XML every time the user wants to view something, or to use SQLite? And since I make heavy use of adapters in my app to create lists, will that degrade performance?

    Read the article

  • What is a better way to retrieve data in an application that handles a limited amount of data?

    - by Milanix
    This is not really a coding question, since I am not adding any code here; adding my code snippets would make this question really long. Instead, I am interested in better ways to retrieve data in an application that handles a limited amount of data which isn't updated regularly. Let's take this example: I am writing an application which gets a schedule as XML from a server. I have written logic to parse the XML version and update the database only if that version is newer than the local version. Although the update check runs automatically/manually on a daily basis depending on user preference, the actual version update happens only once every few months or so, since it is done by another authority which doesn't provide an API but rather announces its changes publicly. The actual XML contains "(n number of groups)(days in a week)(n number of schedules)". The number of groups is usually 6 and the number of schedules is usually 2, so there would usually be only around 100 strings. I have used SQLite so far, and I want to know how to apply the update to the database. Should I show a progress dialog saying the application is updating, and exit the app when it's done? Since my updates are infrequent I don't think this will really harm the user experience, but is there a better way to do it? I don't want the update to be made while the user is searching, which also uses the database; this causes a "database already open" exception. At least I have faced this problem before. Is it better to parse the XML every time the user wants to view something, or to use SQLite? Since I make heavy use of adapters in my app to create lists, will that degrade performance? It would really be a great help if anyone could give me a better overview of this, or perhaps counter-arguments against each approach. Many thanks!
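
    The locking concern is language-neutral: funnel every read and write through a single open connection and apply the whole update in one transaction. A minimal sketch of that shape, written in C# with Microsoft.Data.Sqlite for illustration (table and column names are invented; on Android the same pattern applies with a single SQLiteOpenHelper instance):

        // Sketch: avoid "database already open" conflicts by sharing one
        // connection and applying the XML-driven update in one transaction.
        using System.Collections.Generic;
        using Microsoft.Data.Sqlite;

        class ScheduleStore
        {
            readonly SqliteConnection conn;
            readonly object gate = new object();

            public ScheduleStore(string path)
            {
                conn = new SqliteConnection($"Data Source={path}");
                conn.Open();
            }

            // Readers and the updater take the same lock, so a search can
            // never observe a half-applied schedule.
            public void ApplyUpdate(IEnumerable<(string grp, string entry)> rows)
            {
                lock (gate)
                {
                    using var tx = conn.BeginTransaction();
                    using var cmd = conn.CreateCommand();
                    cmd.Transaction = tx;
                    cmd.CommandText = "DELETE FROM schedule"; // hypothetical table
                    cmd.ExecuteNonQuery();
                    cmd.CommandText = "INSERT INTO schedule(grp, entry) VALUES ($g, $e)";
                    var g = cmd.CreateParameter(); g.ParameterName = "$g"; cmd.Parameters.Add(g);
                    var e = cmd.CreateParameter(); e.ParameterName = "$e"; cmd.Parameters.Add(e);
                    foreach (var (grp, entry) in rows)
                    {
                        g.Value = grp; e.Value = entry;
                        cmd.ExecuteNonQuery();
                    }
                    tx.Commit();
                }
            }
        }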

    Read the article

  • Why is Web SQL database deprecated?

    - by user221287
    I am making a hybrid Android app. At first I decided to use localStorage; after spending 2 days on it, I realized that it is very strange and dropped it. Then I picked up IndexedDB; after spending today's whole day on it, and actually getting output in Google Chrome, it would not run inside a WebView in the Android app. And I never used the Web SQL database at all, because it was deprecated. Anyhow, it has come to my notice that PhoneGap still uses Web SQL and Android's browsers support it. Why was Web SQL deprecated in the first place? And would it be a good idea for me to go with Web SQL now?

    Read the article

  • New Windows Azure: persistent Linux virtual machines, IaaS, and even more open-source technologies supported

    New Windows Azure: persistent virtual machines running Linux. IaaS, and even more open-source technologies supported. Windows Azure, Microsoft's cloud platform for developers, continues to gain momentum. Since yesterday, several new services have been officially announced. Among them, one of the most anticipated (and the one that fueled the most rumors) is the arrival of persistent virtual machines capable of running Linux distributions (Ubuntu, OpenSUSE, CentOS, SUSE Linux Enterprise Server). Azure now combines infrastructure and platform services for "greater flexibility in the way of...

    Read the article

  • Strategy to store/average logs of pings

    - by José Tomás Tocino
    I'm developing a site to monitor web services. The most basic type of check is sending a ping and storing the response time in a CheckLog object. By default, PingCheck objects are triggered every minute, so in one hour you get 60 CheckLogs and in one day you get 1440 CheckLogs. That's a lot of them; I don't need to store that level of detail, so I've set up a collapsing mechanism that periodically takes the uncollapsed CheckLogs older than 24h and collapses (averages) them into intervals of 30 minutes. So if you have 360 CheckLogs saved from 0:00 to 6:00, after collapsing you retain just 12 of them. The problem, well, is this: after averaging the response times, the graph changes drastically. What can I do to improve this? I guess one option could be narrowing the interval duration to 15 min. I've seen the graphs on the GitHub status page and they do not seem to suffer from this problem. I'd appreciate any kind of information you could give me about this area.
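
    One common fix is to keep each bucket's minimum and maximum alongside the mean, so spikes survive the averaging and the graph can be drawn as a band around the average. A minimal C# sketch of the collapsing step (CheckLog and its field names are illustrative, not from the question):

        // Sketch: collapse per-minute CheckLogs into 30-minute buckets,
        // keeping min/max next to the mean so spikes stay visible.
        using System;
        using System.Collections.Generic;
        using System.Linq;

        record CheckLog(DateTime At, double ResponseMs);
        record Collapsed(DateTime BucketStart, double Mean, double Min, double Max);

        static class Collapser
        {
            public static List<Collapsed> Collapse(IEnumerable<CheckLog> logs,
                                                   TimeSpan bucket)
            {
                return logs
                    .GroupBy(l => new DateTime(
                        l.At.Ticks - l.At.Ticks % bucket.Ticks)) // floor to bucket start
                    .OrderBy(g => g.Key)
                    .Select(g => new Collapsed(
                        g.Key,
                        g.Average(l => l.ResponseMs),
                        g.Min(l => l.ResponseMs),
                        g.Max(l => l.ResponseMs)))
                    .ToList();
            }
        }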

    Read the article

  • Microsoft adds a new Build service to the hosted version of Team Foundation Server, allowing compilation on Windows Azure

    Microsoft adds a new Build service to the hosted version of Team Foundation Server, allowing source code to be compiled on Windows Azure. Microsoft has updated the hosted version of Team Foundation Server on Windows Azure. Team Foundation Server is a collaborative development solution providing source control, builds, work item tracking, planning, and performance analysis. A year ago, Microsoft ported the solution to its Windows Azure cloud hosting platform. The company has just announced that a new Build service has been added to Team Foundation. This service reduces the...

    Read the article

  • Azure VS Tools and SDK - systray already running…

    - by Shawn Cicoria
    If you get a "Systray already running…" message when you start the Compute Emulator from within Visual Studio, one fix is to check what image name the process is loading under. For some reason, on 2 of my machines the image was loading in the 8.3 format. This caused the logic in the VS tools to not find the process. So, to fix it, I just did a little copy/rename magic:

        C:\Program Files\Windows Azure SDK\v1.3\bin>copy csmonitor.exe csmonitor-a.exe
                1 file(s) copied.
        C:\Program Files\Windows Azure SDK\v1.3\bin>del csmonitor.exe
        C:\Program Files\Windows Azure SDK\v1.3\bin>copy csmonitor-a.exe csmonitor.exe
                1 file(s) copied.

    If you bring up Task Manager and see something like CSMON~1.EXE in the Image Name column, you probably have this issue.

    Read the article

  • How should a team share/store game content during development?

    - by irwinb
    Other than Dropbox, what out there has been especially useful for storing and sharing game content like images during development (with a similar feature set to Dropbox: working offline, automatic syncing, and support for Windows/OS X)? We are looking into hosting our own SharePoint server, but it seems to be really focused on documents... Maybe Box.net would work? EDIT: For code, we are using Git. To be more precise, I was looking for an easy, automatic way for content produced by artists/audio engineers to be available to everyone. Features like asset approvals don't hurt either. Following the answer linked by Tetrad, Alienbrain looked pretty interesting, but... it is way out of our budget (maybe something to invest in in the future). What we ended up doing: we were going to go with Box.net, but downloading the sync apps for desktop use required us to wait to be contacted by them for some reason. We did not have much time to wait, so we ended up going with Dropbox Teams. Box.net has a nice feature set, but we never really felt held back without it. Thanks for the help :).

    Read the article

  • Storing Projects on Google Drive (Cloud)

    - by JamesKraw
    I've started using Google Drive for my cloud needs, backing up pretty much everything. I've got the app installed so it auto-syncs all my content in most things. My question is this: I am currently coding for iOS (although this applies to any coding project) and am split on storing my project files on Google Drive while using sync. My theory is that if I use it, I'll never have to worry about system crashes or code lost before backups; but if I do use it, it will be syncing a lot, and I thought there might be problems with it detecting changes and trying to sync, for example, halfway through compiling. Bandwidth isn't an issue, as I have a fast connection and an unlimited monthly allowance. Has anyone used this, or similar cloud-based syncing (Dropbox etc.), for this, and knows whether it works and whether there are any potential problems?

    Read the article

  • Top Exastack Partner Headlines - 7 Nov

    - by Roxana Babiciu
    R-Style Softlab recommends RS-solutions with Exadata and SuperCluster to improve business performance for their customers. Read more. WingArc, a subsidiary of 1st Holdings, Inc., finds that Oracle Exadata's substantial computing power enables their enterprise output management solution (SVF/RD); it helps solve the problem of providing an integrated output platform for high-volume businesses. Read more.

    Read the article

  • The new Windows Azure has been available since this morning; the 90-day evaluation version is still valid

    New Windows Azure: persistent virtual machines running Linux. IaaS, and even more open-source technologies supported. Edit of 08/06/12: the new Windows Azure has been available since this morning (0:15, Paris time). Windows Azure, Microsoft's cloud platform for developers, continues to gain momentum. Since yesterday, several new services have been officially announced. Among them, one of the most anticipated (and the one that fueled the most rumors) is the arrival of persistent virtual machines capable of running Linux distributions (Ubuntu, OpenSUSE, CentOS, SUSE Linux Enterprise Server). Az...

    Read the article

  • MySQL tables corrupted and need to be repaired: how to fix them and how to prevent this [on hold]

    - by Hbirjand
    We have developed a piece of software that uses a MySQL database, and after some months of use, some of the database tables have become corrupted and need to be repaired. What causes this corruption? How can we fix the tables, and how can we prevent them from becoming corrupted? Because of the corruption, we are not able to create a backup using Navicat, and we cannot generate a dump either. Is it possible to fix these tables using the MySQL engine? Some of the corruption shows up as empty fields in the tables.
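
    For what it's worth, CHECK TABLE and REPAIR TABLE can be run from code as well as from any MySQL client (or via the mysqlcheck command-line tool). A hedged C# sketch using the MySql.Data connector, with an invented table name; note that REPAIR TABLE only applies to MyISAM/ARCHIVE/CSV tables, while InnoDB recovery normally means restoring from a dump:

        // Sketch: run CHECK TABLE / REPAIR TABLE via the MySql.Data connector.
        // Connection string and table name are illustrative assumptions.
        using System;
        using MySql.Data.MySqlClient;

        class TableRepair
        {
            static void Main()
            {
                var connStr = "server=localhost;database=app;uid=root;pwd=secret";
                using var conn = new MySqlConnection(connStr);
                conn.Open();

                foreach (var sql in new[] { "CHECK TABLE customers",
                                            "REPAIR TABLE customers" })
                {
                    using var cmd = new MySqlCommand(sql, conn);
                    using var reader = cmd.ExecuteReader();
                    // Both statements return rows: Table, Op, Msg_type, Msg_text.
                    while (reader.Read())
                        Console.WriteLine($"{reader["Op"]}: {reader["Msg_type"]} {reader["Msg_text"]}");
                }
            }
        }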

    Read the article

  • Oracle Virtual Compute Appliance Launch Channel Update Webcast - May 28

    - by Roxana Babiciu
    Join us for an Oracle Virtual Compute Appliance (OVCA) launch update for the channel. This training webcast is a follow-up to the OVCA launch on April 16. We will provide a brief product overview of OVCA, followed by some great OPN program content: resell criteria, OPN Incentive Program, and Demo Equipment Program details. There will be two sessions to accommodate each region. Additionally, don’t miss the latest Oracle Virtual Compute Appliance article packed with great information!

    Read the article

  • Why use binary files to stack up different versions on DMSs?

    - by edgarator
    I've used both Liferay and Alfresco, trying to use them as the document management system for an intranet. I noticed the following: they use the file system and the database to store files; they use a GUID to name the file on the filesystem, and that GUID is used as an id in the database; the GUID-named file is a binary file; the GUID-named binary file stores all versions of a given file; the path of the file in the DMS doesn't match the one in the file system; and the URL references the GUID when a certain file is requested. What I want to know is why this is done, and what the best way of doing it would be. For instance, how would you create the binary file (zip?), and which parts would you keep in the binary file versus in the database (metadata, path?). I can see some of the benefits of doing it like this, such as keeping the same URL for a file regardless of its current document path, and having only one file even if the file has changed names over time.
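
    A minimal sketch of the layout being described, with invented names: content is written once under an opaque GUID, while everything that can change (name, path, version) lives in the database record:

        // Sketch: GUID-keyed content store. Renaming or moving a document
        // later only touches the metadata row; the GUID-named file (and any
        // URL built from it) never changes. Names here are illustrative.
        using System;
        using System.IO;

        record DocumentVersion(Guid ContentId, string LogicalPath,
                               int Version, DateTime StoredAt);

        class ContentStore
        {
            readonly string root;
            public ContentStore(string root) => this.root = root;

            public DocumentVersion Add(string sourceFile, string logicalPath, int version)
            {
                var id = Guid.NewGuid();
                var dest = Path.Combine(root, id.ToString("N"));
                File.Copy(sourceFile, dest);
                // In a real DMS this record would be persisted to the database.
                return new DocumentVersion(id, logicalPath, version, DateTime.UtcNow);
            }
        }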

    Read the article

  • Entity Framework and stored procedure returning temp table Issues

    - by kaplooeymom
    (Disclaimer: I'm not the database designer, I'm just the poor developer who has to make this work.) There are 17 (at the moment) tables with identical structure: name, address, phone number. Given a phone number, I have to check whether there's a matching entry in any of the tables, then return that address. So I created a view to get the list of tables (there's a ref table that holds that info), then created a stored procedure that creates a temp table and, using cursors and concatenated SQL, checks each table in the view for the phone number; if a record is found, it is inserted into the temp table, and the rows from the temp table are returned. This all works in straight T-SQL. Now I'm trying to use Entity Framework 4+ to call the stored procedure, but the function-import interface won't generate columns: it says the return type is none, and the LINQ code expects an int and won't compile. Any ideas on how to make this work? I know I can move the check-tables part into code if I absolutely have to, but I'd rather have the above method work.
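
    A likely cause: the designer discovers a proc's result columns using SET FMTONLY ON, which temp tables defeat, so the import falls back to "no columns". Two common workarounds are switching the proc from a #temp table to a @table variable, or bypassing the import and mapping the results by hand. A hedged sketch of the latter, where the proc name and columns are assumptions, not from the question:

        // Sketch: skip the function import and map the proc's result set
        // manually with EF4's ObjectContext.ExecuteStoreQuery<T>. The POCO's
        // property names must match the result columns.
        using System.Collections.Generic;
        using System.Data.Objects;
        using System.Data.SqlClient;
        using System.Linq;

        public class AddressMatch
        {
            public string Name { get; set; }
            public string Address { get; set; }
            public string PhoneNumber { get; set; }
        }

        public static class PhoneLookup
        {
            public static List<AddressMatch> Find(ObjectContext context, string phoneNumber)
            {
                return context.ExecuteStoreQuery<AddressMatch>(
                    "EXEC dbo.FindAddressByPhone @phone",   // hypothetical proc
                    new SqlParameter("@phone", phoneNumber)).ToList();
            }
        }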

    Read the article

  • Bing search API and Azure

    - by Gapton
    I am trying to programmatically perform a search on the Microsoft Bing search engine. Here is my understanding: there was a Bing Search API 2.0, which will be replaced soon (1st Aug 2012); the new API is known as Windows Azure Marketplace; and you use different URLs for the two. In the old API (Bing Search API 2.0), you specify a key (Application ID) in the URL, and that key is used to authenticate the request; as long as you have the key as a parameter in the URL, you can obtain the results. In the new API (Windows Azure Marketplace), you do NOT include the key (Account Key) in the URL. Instead, you put in a query URL, and the server asks for your credentials: when using a browser, a pop-up asks for an account name and password, and the instruction is to leave the account name blank and insert your key in the password field. Okay, I have done all that and I can see JSON-formatted results of my search in my browser. How do I do this programmatically in PHP? I tried searching for documentation and sample code in the Microsoft MSDN library, but I was either searching in the wrong place or there are extremely limited resources there. Would anyone be able to tell me how to do the "enter the key in the password field in the pop-up" part in PHP, please? Thanks a lot in advance.
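
    That pop-up is plain HTTP Basic authentication with an empty user name and the account key as the password. A hedged sketch of the request shape, written in C# for consistency with the rest of this page (the query URL is abbreviated and illustrative); in PHP the equivalent is cURL with CURLOPT_USERPWD set to ":" followed by the account key:

        // Sketch: the Marketplace "blank user name, key as password" dialog
        // is HTTP Basic auth, so send the key as the Basic-auth password.
        using System;
        using System.IO;
        using System.Net;

        class BingMarketplaceSearch
        {
            static void Main()
            {
                var accountKey = "YOUR_ACCOUNT_KEY"; // from the Marketplace portal
                var url = "https://api.datamarket.azure.com/Bing/Search/Web"
                        + "?Query=%27good%27&$format=json"; // illustrative query

                var request = (HttpWebRequest)WebRequest.Create(url);
                // Empty user name, account key as password -- same as the pop-up.
                request.Credentials = new NetworkCredential("", accountKey);

                using var response = request.GetResponse();
                using var reader = new StreamReader(response.GetResponseStream());
                Console.WriteLine(reader.ReadToEnd());
            }
        }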

    Read the article

  • Cloud Backup: Getting the Users' Backs Up

    - by Tony Davis
    On Wednesday last week, Microsoft announced that as of July 1, all data transfers into its Microsoft Azure cloud will be free (though you have to pay for transferring data out). On Thursday last week, SQL Azure in Western Europe went down. It was a relatively short outage, but since SQL Azure currently provides no easy way to take a standard backup of a database and store it locally, many people had no recourse but to wait patiently for their cloud-based app to resume. It seems that Microsoft are very keen to encourage developers to move their data onto their cloud, but are developers ready to do it, given that such basic backup capabilities are lacking? Recently on Simple-Talk, Mike Mooney described a perfect use case for the Microsoft Cloud. They had a simple web-based application with a SQL Server backend; they could move the application to Windows Azure, and the data into SQL Azure, and in the process free themselves from much of the hassle surrounding management and scaling of the hardware, network and so on. It was a great fit, and yet it nearly didn't happen; lack of support for the BACKUP command almost proved a show-stopper. Of course, backups of Azure databases are always, and have always been, taken automatically for disaster recovery purposes, but these are strictly on-cloud copies, and as of now it is not possible to use them to restore a database to a particular point in time. It seems that none of those clever Microsoft people managed to predict the need to perform basic backups of Azure databases so that copies could be stored locally, outside the Azure universe. At the very least, as Mike points out, performing a local backup before a new deployment is more or less mandatory. Microsoft did at least note the sound of gnashing teeth and, as a stop-gap measure, offered SQL Azure Database Copy, which basically allows you to create an online clone of your database, but this doesn't allow for storing local archives of the data. To that end, MS has provided SQL Azure Import/Export, to package up and export a database and its data using BACPACs. These BACPACs do not guarantee transactional consistency; for example, if a child table is modified after the parent is copied, then the copied database will be in an inconsistent state (meaning, to add to the fun, BACPACs need to be created from a database copy). In any event, widespread problems with BACPAC's evil cousin, the DACPAC, have been well documented, and it seems likely that many will also give the BACPAC the bum's rush. Finally, in a TechEd 2011 presentation tagged "SQL Azure Advanced Administration", it was announced that "backup and restore" were coming in the next SQL Azure CTP. And yet this still doesn't mean that we'll get simple backups as DBAs know and love them. What it does mean, at least, is the ability to restore any given database to a point in time within a 2-week window. For the time being, if you want a local copy of your data and don't want to brave the BACPAC, you are left with SSIS or BCP, creative use of schema and data comparison tools, or SQL Azure Backup (currently in beta) in order to perform this simple but vital task. Cheers, Tony.

    Read the article

  • Connecting Dell PowerVault NAS to ESXi

    - by Matt Fitz
    I just got a Dell PowerVault NAS storage device running Windows Storage Server Standard, which includes NFS. When I try to connect the ESXi server, I get the following message: Call "HostDatastoreSystem.CreateNasDatastore" for object "ha-datastoresystem" on ESXi "powerhouse" failed. Operation failed, diagnostics report: Unable to complete Sysinfo operation. Please see the VMkernel log file for more details. I am pretty sure it is a username/password type of thing, but I'm not sure where to begin. Also, I am planning on using "Username Mapping" instead of Active Directory. Any ideas would be greatly appreciated.

    Read the article

  • Many partitions on a single filegroup: does it make sense?

    - by river0
    Hi, I'm designing a data warehouse solution and I'm a newbie in disk configuration issues; let me explain. Our storage is spread over 6 storage enclosures, each of them holding 5 RAID-1 disk arrays, with 2 LUNs defined per disk array, which makes a total of 48 LUNs (this follows Microsoft's Fast Track recommendations for data warehouse architectures). I would like to partition my data; on other projects I have worked on, we always followed a 1 partition - 1 filegroup rule. The Microsoft Fast Track recommendations advise creating a filegroup and then, for that filegroup, a data file per LUN... but I intend to have week-level partitioning, and if I apply that rule, I think I'll get too many files and a complex layout. I'm thinking of creating just one filegroup (with the 48 LUN data files), but still creating the partitions, since I want to keep some of the benefits of partitioning, like partition switching... Is this scenario not recommended? What would you suggest?

    Read the article

  • Windows 2008 terminal server - How to restrict access to DVD/floppy?

    - by test1839
    I have a very simple task: I need to block access to removable media (CD, DVD, floppy, USB drives, etc.) on a Windows 2008 R2 Terminal Server for users, while allowing it for admins. I tried to enable the following policy in a GPO: User Configuration/Administrative Templates/System/Removable Storage Access - "All Removable Storage classes: Deny all access" = Enabled. But it did not work. I tried different physical and virtual 2008 servers with the same result. It works on Windows 7, but not on Windows 2008. Has anyone had success with this setting on Windows 2008? Thank you.

    Read the article

  • Question regarding a NAS server and remote users

    - by JB
    I have a client that requires a massive amount of storage space but: doesn't want to spend very much money; needs remote users (across the country) to be able to pull data from it and store data to it as well. Can this be done with a NAS server such as the Western Digital ShareSpace Network Storage System? I do not believe the client wants to spend over $1400, and HP is offering 8TB for $1299. Also, if anyone has any other ideas besides using a NAS, please let me know. Thanks in advance for any help.

    Read the article
