Search Results

Search found 6759 results on 271 pages for 'backup strategies'.

Page 218 of 271

  • Non-blocking TCP buffer issues.

    - by Poni
    Hi! I think I have a problem. I have two TCP apps connected to each other which use Winsock I/O completion ports to send/receive data (non-blocking sockets). Everything works just fine until there's a data transfer burst: the sender starts sending incorrect/malformed data. I allocate the buffers I'm sending on the stack, and if I understand correctly that's the wrong thing to do, because these buffers should remain as I sent them until I get the "write complete" notification from IOCP. Take this for example:

        void some_function()
        {
            char cBuff[1024];
            // filling cBuff with some data
            WSASend(...); // sending cBuff, non-blocking mode
            // filling cBuff with other data
            WSASend(...); // again, sending cBuff
            // ..... and so forth!
        }

    If I understand correctly, each of these WSASend() calls should have its own unique buffer, and that buffer can be reused only when the send completes. Correct? Now, what strategies can I implement in order to maintain a big pool of such buffers, how should I handle them, and how can I avoid a performance penalty? And if I am to use such buffers, that means I have to copy the data to be sent from the source buffer into the temporary one, so I'd set SO_SNDBUF on each socket to zero so the system will not re-copy what I already copied. Are you with me? Please let me know if I wasn't clear.
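
    A minimal sketch of the buffer-ownership rule described above. The question is about Win32 WSASend/IOCP, but the same idea is shown here in C# with SocketAsyncEventArgs purely as an illustration: every pending send gets its own pooled buffer, and the buffer goes back to the pool only when the completion callback fires. All names are hypothetical.

        using System;
        using System.Collections.Concurrent;
        using System.Net.Sockets;

        class SendBufferPool
        {
            private readonly ConcurrentBag<byte[]> _pool = new ConcurrentBag<byte[]>();
            private readonly int _bufferSize;

            public SendBufferPool(int bufferSize, int initialCount)
            {
                _bufferSize = bufferSize;
                for (int i = 0; i < initialCount; i++)
                    _pool.Add(new byte[bufferSize]);
            }

            public byte[] Rent()
            {
                byte[] buf;
                return _pool.TryTake(out buf) ? buf : new byte[_bufferSize];
            }

            public void Return(byte[] buf)
            {
                _pool.Add(buf);
            }
        }

        class Sender
        {
            private readonly Socket _socket;
            private readonly SendBufferPool _pool = new SendBufferPool(1024, 64);

            public Sender(Socket socket)
            {
                _socket = socket;
            }

            public void Send(byte[] data, int count)   // assumes count <= 1024
            {
                // Copy into a pooled buffer so the caller's array can be reused immediately.
                byte[] buf = _pool.Rent();
                Buffer.BlockCopy(data, 0, buf, 0, count);

                var args = new SocketAsyncEventArgs();
                args.SetBuffer(buf, 0, count);
                args.Completed += OnSendCompleted;

                // If SendAsync completes synchronously, the Completed event is not raised.
                if (!_socket.SendAsync(args))
                    OnSendCompleted(_socket, args);
            }

            private void OnSendCompleted(object sender, SocketAsyncEventArgs args)
            {
                _pool.Return(args.Buffer);   // only now is the buffer safe to reuse
                args.Dispose();
            }
        }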

    Read the article

  • database importing problem with sql server

    - by tibin mathew
    Hi, I have a database working on my local SQL Server 2005 Express edition. I have to import my local database into a remote server database. For that I established a connection to the remote server, and I can now see that database. But when I tried to restore the database from my local machine, I got an error message when I tried to give the backup file location. Below is the error message:

        The EXECUTE permission was denied on the object 'xp_availablemedia', database 'mssqlsystemresource', schema 'sys'. The user does not have permission to perform this action. The statement has been terminated. (Microsoft SQL Server, Error: 229)

    What is the problem, and how can I solve it? Please help me.

    Read the article

  • Svn - get the list of all repos on a server so I can svnsync

    - by egarcia
    I'm attempting to create a backup of my client's existing SVN repositories, which are publicly available over HTTP. If possible, I'd like to be able to make the new repositories automatically, from any computer, without having to give console access to the server to external parties (i.e. letting users do an ls on my SVN repo dir). My problem is that I need to know the list of SVN repositories on the server - it isn't a fixed list, since the user will add new repositories over time. I'm able to list the repositories on an HTML page via Apache's mod_dav_svn module, using the SVNListParentPath On directive. I got this page: http://svn.ohwr.org/ My question is: what is the easiest way to obtain a usable list of such repositories? I'll need to parse that list in order to run the syncs, probably using shell commands. Must I parse the HTML with shell commands, or is there a better way to get that list?
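
    A rough sketch of the "parse the SVNListParentPath page, then sync each repository" approach, written in C#. The href pattern matches the plain <a href="name/"> links a mod_dav_svn index usually emits; treat the regex, the backup path, and the assumption that each mirror repository already exists (svnadmin create plus a pre-revprop-change hook) as things to verify, not as givens.

        using System;
        using System.Net;
        using System.Text.RegularExpressions;

        class ListSvnRepos
        {
            static void Main()
            {
                const string parentUrl = "http://svn.ohwr.org/";

                string html;
                using (var client = new WebClient())
                {
                    html = client.DownloadString(parentUrl);
                }

                // Relative links like href="myrepo/"; absolute links and "../" are skipped.
                foreach (Match m in Regex.Matches(html, "href=\"([^\"/:.][^\":]*)/\""))
                {
                    string repo = m.Groups[1].Value;
                    Console.WriteLine("svnsync init file:///var/backup/{0} {1}{0}", repo, parentUrl);
                    Console.WriteLine("svnsync sync file:///var/backup/{0}", repo);
                }
            }
        }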

    Read the article

  • Best practice: How do I implement a list that can be rendered both server-side and client-side?

    - by André Pena
    Technologies involved: ASP.NET Web Forms, JavaScript (jQuery for instance). Case: to make it clearer, let's take the Stackoverflow authors list as an example. This list can be manipulated at the client side - I can search, page and so forth. So obviously we would need to call jQuery.ajax to retrieve the HTML of each page given a search. Alright. Now this leaves me with the first question: what is the best way to render the response for the jQuery.ajax call at the server side? I can't use templates, I suppose, so the most obvious solution I think is to create the HTML tags as server controls and render them as the result of an ASHX request. Is this the best approach? Nice. That solved, we have yet another problem: when the user first enters the authors list, the first page should already come from the server completely rendered, right? Of course we could render the first page via an ajax call as well, but I don't think that's better. This time I CAN use templates to render the list, but this template couldn't be reused in case 1. What do I do? Now the final question: we now have two rendering strategies, 1) client and 2) server. How do I reuse code for the two renderings? What are the best practices for solving these problems?
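
    A minimal sketch of one way to reuse the same server-side template for both the initial render and the jQuery.ajax responses: put the list markup in a user control, include it on the page for the first load, and have an ASHX handler render the same control to a string for ajax calls. The control path and handler name here are hypothetical, and controls that need a server-side form would also require an HtmlForm in the control tree.

        using System;
        using System.IO;
        using System.Web;
        using System.Web.UI;

        public class AuthorsListHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                var page = new Page();
                Control list = page.LoadControl("~/Controls/AuthorsList.ascx");
                // Search term and page number from context.Request could be passed to
                // the control here by casting it to its concrete type.
                page.Controls.Add(list);

                var writer = new StringWriter();
                context.Server.Execute(page, writer, false);   // renders the control tree only

                context.Response.ContentType = "text/html";
                context.Response.Write(writer.ToString());
            }

            public bool IsReusable
            {
                get { return false; }
            }
        }

    The client then replaces the list's container with the returned HTML, so both rendering paths share one template.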

    Read the article

  • Best practice: How do I implement a list similar to Stackoverflow's Users List?

    - by André Pena
    Technologies involved: ASP.NET Web Forms, JavaScript (jQuery for instance). Case: to make it clearer, let's take the Stackoverflow Users list as an example. This list can be manipulated at the client side - I can search, page and so forth. So obviously we would need to call jQuery.ajax to retrieve the HTML of each page given a search. Alright. Now this leaves me with the first question: what is the best way to render the response for the jQuery.ajax call at the server side? I can't use templates, I suppose, so the most obvious solution I think is to create the HTML tags as server controls and render them as the result of an ASHX request. Is this the best approach? Nice. That solved, we have yet another problem: when the user first enters the users list, the first page should already come from the server completely rendered, right? Of course we could render the first page via an ajax call as well, but I don't think that's better. This time I CAN use templates to render the list, but this template couldn't be reused in case 1. What do I do? Now the final question: we now have two rendering strategies, 1) client and 2) server. How do I reuse code for the two renderings? What are the best practices for solving these problems?

    Read the article

  • svn server synchronise automatically

    - by zapping
    I have an SVN server on our LAN locally; it's on Windows. The developers use it and check in/out from it. Just to be on the safe side we have taken a server from Rackspace, a Linux one. Is it possible to do an automatic weekly synchronise from the local SVN server to the remote one? The remote one will mainly be used as a remote backup, but also so that if somebody wants to access it from outside they can do so, as there is no static or external IP for our LAN.

    Read the article

  • iCloud + Storage of media in iPhone Documents folder

    - by Michael Morrison
    I, like many developers, got an email from Apple recently stating that we should move our data from the Documents directory into another folder to permit more streamlined backup to iCloud:

        In recent testing it appears that [your app] stores a fair amount of data in its Documents folder. Since iCloud backups are performed daily over Wi-Fi for each user's iOS device, it's important to ensure the best possible user experience by minimizing the amount of data being stored by your app.

    Marco Arment, of Instapaper fame, has a good take on the issue, which is that the recommended location for storing downloadable files is in /Library/Caches. However, the problem is that both /tmp and /Caches can be 'cleaned' any time the OS decides the device is running low on storage. If your app is cleaned, then the data downloaded by your app and stored by your user is gone. Naturally, the user will blame you and not Apple. What to do?

    Read the article

  • GAC behaviour

    - by pkolodziej
    I put a signed DLL into the GAC. I delete this DLL from the folder where other applications could reach it. I try to run a client app which used that DLL. The DLL is immediately put back into the original folder. How does this happen? I am guessing that the GAC is monitoring the folder, and when it detects that the DLL is missing it puts the latest version back into the folder where other applications could reach it. If I am correct, please tell me whether the GAC will automatically restore the DLL again after it is rebuilt.

    Read the article

  • Script or utility to export from ScrewTurn Wiki into Confluence?

    - by jrummell
    Has anyone tried migrating from ScrewTurn to Confluence? I'm hoping that I can export the ScrewTurn database to XML and then use a utility to put it in a format that Confluence can understand - perhaps the format used by the Universal Wiki Converter. Has anyone used such a utility? Is there anything I should know before I try to write one myself?

    Update: I've installed the Confluence trial and took a look at the backup. There's an entities.xml file that looks like the data store. The root tag is <hibernate-generic>, which leads me to believe that they use Hibernate. I'm not familiar with Hibernate, but this should be useful.

    Read the article

  • Non-existent file in limbo prevents push to remote branch (Bazaar VCS)

    - by das_weezul
    Hi! I use Bazaar VCS to version files locally on my notebook. When I'm in the office I merge the changes to a repository on a Windows share and also push all the files there (for backup reasons). My problem: the last push resulted in an error, because I had added a file with a very long filename (I've had that problem before... Python doesn't like long filenames). So I removed the file (I didn't need it anyway) and forgot about the problem for a while, because committing still worked fine. The next time I wanted to push my new revision I got a new error:

        bzr: ERROR: [Error 3] Das System kann den angegebenen Pfad nicht finden: u'//path/to/remote/branch/.bzr/checkout/limbo/new-8/loooooooongfilename.xls'

    (Translation: bzr: ERROR: [Error 3] The system cannot find the specified path.)

    What I've tried:

    - Deleting the limbo folder -- the limbo folder doesn't exist
    - Creating the missing path with a dummy file -- Bazaar locks the branch -- unlock -- same problem as before
    - bzr check -- everything is fine -- no success
    - bzr reconcile -- no success

    Thanks for reading ;o)

    Read the article

  • Magic behind R.java file

    - by LambergaR
    Hi! Recently I have been having quite some problems with the R.java file. I decided to make a backup and delete the file to see what happens. Nothing happened, so I created an empty R.java file and hoped for the best. Now Eclipse seems to figure out that the file was tampered with and even issues a warning: "R.java was modified manually! Reverting to generated version!" And that's all there is. I tried building it manually but got no results. So I have two questions:

    1. What should I do to force Eclipse to regenerate the file?
    2. What is happening here? How is the file created, and where is the code that generates it?

    I would appreciate any help. As usual, the problem occurred just a few days before the deadline :)

    Read the article

  • Best open source alternative for MS Visual Source Safe?

    - by afsharm
    We are leaving VSS for TFS or another alternative. I'm the one pushing to go for an open-source alternative like SVN. Now I'm searching for a good open-source version control system with the following aspects in mind:

    - We are in love with the open-source movement and cross-platform support.
    - Would it be possible to use it with Mono, SharpDevelop and Express editions of VS instead of Visual Studio itself?
    - What about backup?
    - Does it integrate with VS without serious problems?
    - Any API or command-prompt access?

    Please note I've read the following previous questions about it but still need more help:

    http://stackoverflow.com/questions/690766/vss-or-svn-for-a-net-project
    http://stackoverflow.com/questions/61959/tfs-vs-open-source-alternatives
    http://stackoverflow.com/questions/44588/how-to-convince-a-company-to-switch-their-source-control

    Read the article

  • Omniauth/Devise/Facebook: Auth route is not recognized

    - by M. Cypher
    I've been working on this problem for 7 hours now, and I still have no idea. Maybe one of you can help me. I'm simply trying to integrate the OAuth feature of Devise 1.2rc, which uses OmniAuth, into my Rails application. I've been using this tutorial by Devise: https://github.com/plataformatec/devise/wiki/OmniAuth%3A-Overview I have done everything they tell you to:

    - Yes, I have added the following line to my devise.rb: config.omniauth :facebook, "APP ID", "APP SECRET"
    - I have added :omniauthable to my user model, as well as the class function as described in the tutorial.
    - I have implemented the omniauth_callbacks controller, as well as the callback function, and I have specified the omniauth_callbacks controller in my routes.rb.
    - When I run "rake middleware" it does list the OmniAuth middleware: use OmniAuth::Strategies::Facebook
    - I have installed Devise directly from the Git repo, master branch, so it's up to date.
    - I have installed Omniauth 1.2.0.beta5, which is the latest version. In my Gemfile it says: gem 'oa-oauth', '0.2.0.beta5', :require => 'omniauth/oauth'
    - I have restarted the server, obviously.

    However, when I try to request this URL: http://localhost:3000/auth/facebook it simply says:

        ActionController::RoutingError (No route matches "/auth/facebook"):

    /user/auth/facebook doesn't work either. Since I unfortunately don't have the time to take apart the entire OmniAuth and Devise gems and understand every line of code in them, maybe one of you could tell me what the problem might be.

    Read the article

  • Testing fault tolerant code

    - by Robert
    I'm currently working on a server application where we have agreed to try to maintain a certain level of service. The level of service we want to guarantee is: if a request is accepted by the server and the server sends an acknowledgement to the client, we want to guarantee that the request will happen, even if the server crashes. As requests can be long-running and the acknowledgement time needs to be short, we implement this by persisting the request, then sending an acknowledgement to the client, then carrying out the various actions to fulfill the request. As actions are carried out they too are persisted, so the server knows the state of a request on start-up, and there are also various reconciliation mechanisms with external systems to check the accuracy of our logs. This all seems to work fairly well, but we have difficulty saying so with any conviction, because we find it very difficult to test our fault-tolerant code. So far we've come up with two strategies, but neither is entirely satisfactory:

    1. Have an external process watch the server code and then try to kill it off at what the external process thinks is an appropriate point in the test.
    2. Add code to the application that will cause it to crash at certain known critical points.

    My problem with the first strategy is that the external process cannot know the exact state of the application, so we cannot be sure we're hitting the most problematic points in the code. My problem with the second strategy, although it gives more control over where the fault takes place, is that I do not like having code to inject faults within my application, even with optional compilation etc. I fear it would be too easy to overlook a fault injection point and have it slip into a production environment.
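
    One hedged sketch of the second strategy that keeps the production path clean: the server calls a named checkpoint through an interface, production wires in a no-op, and only the test harness installs an implementation that actually kills the process. The checkpoint names below are illustrative.

        using System;

        public interface ICrashPoint
        {
            void Reached(string name);
        }

        // Production implementation: checkpoints cost one virtual call and do nothing.
        public sealed class NoCrash : ICrashPoint
        {
            public void Reached(string name) { }
        }

        // Test implementation: terminates the process hard (no finalizers, no cleanup)
        // when the configured checkpoint is hit, approximating a real crash.
        public sealed class CrashAt : ICrashPoint
        {
            private readonly string _target;

            public CrashAt(string target)
            {
                _target = target;
            }

            public void Reached(string name)
            {
                if (name == _target)
                    Environment.FailFast("fault injection: " + name);
            }
        }

        // Usage inside the request pipeline (names are illustrative):
        //   crashPoint.Reached("after-persist-before-ack");
        //   SendAcknowledgement(client);
        //   crashPoint.Reached("after-ack-before-work");

    Because the crash points are injected rather than compiled in, forgetting to remove them is no longer a risk; the worst case in production is a no-op call.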

    Read the article

  • File system to choose for NAS box based on FreeNAS or OpenFiler [closed]

    - by Chris
    I'm a photographer, and I need to find a better method of storing my photos than multiple USB2 drives via USB hubs. Currently I use a MacBook Pro and 6 external drives connected via USB2 or FW800; 3 are a copy of the first three, kept up to date manually by running an rsync backup. I'd like to run a FreeNAS or OpenFiler NAS box using 2TB drives mirrored via software RAID. But I would also like the flexibility of plugging into the drive physically for faster throughput when necessary. So my question is: is there a file system that both *nix and Mac OS X will play nice with? Many thanks, Chris.

    Read the article

  • Restoring database with SMO - Reporting progress problems

    - by madlan
    I'm using the code below to restore a database in VB.NET. This works, but it causes the interface to lock up if the user clicks anything. Also, I cannot get the progress label to update incrementally; it's blank until the restore is complete and then displays 100%.

        Sub DoRestore()
            Dim svr As Server = New Server("Server\SQL2008")
            Dim res As Restore = New Restore()
            res.Devices.AddDevice("C:\MyDB.bak", DeviceType.File)
            res.Database = "MyDB"
            res.RelocateFiles.Add(New RelocateFile("MyDB_Data", "C:\MyDB.mdf"))
            res.RelocateFiles.Add(New RelocateFile("MyDB_Log", "C:\MyDB.ldf"))
            res.PercentCompleteNotification = 1
            AddHandler res.PercentComplete, AddressOf ProgressEventHandler
            res.SqlRestore(svr)
        End Sub

        Private Sub ProgressEventHandler(ByVal sender As Object, ByVal e As PercentCompleteEventArgs)
            Label3.Text = ""
            ProgressBar.Value = e.Percent
            LblProgress.Text = e.Percent.ToString
        End Sub
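
    The lock-up happens because Restore.SqlRestore runs synchronously on the UI thread, so the PercentComplete events cannot repaint the form until the restore finishes. A minimal sketch of the usual fix, written in C# and assuming the same form controls as above: run the restore on a worker thread and marshal progress back with Control.Invoke. A BackgroundWorker would work the same way.

        using System;
        using System.Threading;
        using System.Windows.Forms;
        using Microsoft.SqlServer.Management.Smo;

        public partial class RestoreForm : Form
        {
            // Assumes ProgressBar and LblProgress controls exist on the form, as in the question.
            private void RestoreButton_Click(object sender, EventArgs e)
            {
                var worker = new Thread(() =>
                {
                    var server = new Server(@"Server\SQL2008");
                    var restore = new Restore();
                    restore.Database = "MyDB";
                    restore.Devices.AddDevice(@"C:\MyDB.bak", DeviceType.File);
                    restore.PercentCompleteNotification = 1;
                    restore.PercentComplete += (s, args) =>
                        Invoke((MethodInvoker)(() =>
                        {
                            ProgressBar.Value = args.Percent;
                            LblProgress.Text = args.Percent.ToString();
                        }));
                    restore.SqlRestore(server);   // blocks this worker thread, not the UI thread
                });
                worker.IsBackground = true;
                worker.Start();
            }
        }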

    Read the article

  • SQL ConnectionString in global.asax overridden by web.config

    - by rlb.usa
    This is going to sound very odd, but I have a web.config like this:

        <connectionStrings>
            <remove name="LocalSqlServer"/>
            <add name="LocalSqlServer" connectionString="Data Source=BACKUPDB;..." providerName="System.Data.SqlClient"/>
        </connectionStrings>

    And a global.asax like this:

        void Session_Start(object sender, EventArgs e)
        {
            // Code that runs when a new session is started
            if (Application["con"] == null || Application["con"] == "")
            {
                Application["con"] = "Data Source=PRODUCTIONDB;...";
            }
        }

    And EVERYWHERE in my code, I reference my connection string like this:

        SqlConnection con = new SqlConnection(Convert.ToString(HttpContext.Current.Application["con"]));

    However, I see that everything I do inside this application goes to BACKUPDB instead of PRODUCTIONDB. What is going on, how could this happen, and why? It doesn't make any sense to me, and it got me into a lot of trouble. We use the LocalSqlServer string for FormsAuthentication.
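
    One thing worth noting: anything wired to the LocalSqlServer name (here, FormsAuthentication and the provider stack) reads web.config directly and never sees Application["con"], so two sources of truth are in play. A hedged sketch of keeping a single named connection string in web.config and reading it through ConfigurationManager wherever a connection is opened; the "Production" name is an assumption, not something from the question.

        using System.Configuration;
        using System.Data.SqlClient;

        static class Db
        {
            public static SqlConnection Open()
            {
                // <add name="Production" connectionString="Data Source=PRODUCTIONDB;..." /> in web.config
                string cs = ConfigurationManager.ConnectionStrings["Production"].ConnectionString;
                var con = new SqlConnection(cs);
                con.Open();
                return con;
            }
        }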

    Read the article

  • Future proofing client-server code?

    - by Koran
    Hi, we have a web-based client-server product. The client is expected to be used by upwards of 1M users (a famous company is going to use it). Our server is set up in the cloud. One of the major questions while designing is how to make the whole program future-proof; say:

    - The cloud provider goes down, then move automatically to a backup in another cloud
    - Move to a different server altogether
    - etc.

    The options we have thought of so far are:

    1. DNS: running a DNS name server in the cloud ourselves.
    2. Directory server: the directory server also lives in the cloud.
    3. Have our server return future movements and future URLs etc. to the client, wherein the client is specifically designed to handle those scenarios.

    Since this should be a common problem, which is the best solution for it? Since our company is a very small one, we are looking for the least technically and financially expensive solution (say option 3, etc.). Could someone provide some pointers? K
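
    For option 3, a hedged sketch of what the client side could look like: the client ships with an ordered list of known server URLs, refreshes that list from whichever server answers, and fails over to the next candidate when a request cannot be completed. All names and URLs below are placeholders.

        using System;
        using System.Collections.Generic;
        using System.Net;

        class FailoverClient
        {
            private readonly List<string> _baseUrls;

            public FailoverClient(IEnumerable<string> baseUrls)
            {
                _baseUrls = new List<string>(baseUrls);
            }

            public string Get(string path)
            {
                foreach (string baseUrl in _baseUrls)
                {
                    try
                    {
                        using (var client = new WebClient())
                        {
                            return client.DownloadString(baseUrl + path);
                        }
                    }
                    catch (WebException)
                    {
                        // This endpoint is down or has moved; fall through to the next one.
                    }
                }
                throw new InvalidOperationException("No configured endpoint is reachable.");
            }
        }

        // Usage (hypothetical endpoints):
        //   var client = new FailoverClient(new[] { "https://api.example.com", "https://backup.example.net" });
        //   string status = client.Get("/status");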

    Read the article

  • SQL Server 2008 Table Maintenance - Rebuild, Reorganize, Update Stats, Check Integrity etc HELP!

    - by Albert
    I'm migrating a ~15GB database from SQL Server 2005 to a new server running SQL Server 2008, and along with that I need to create all the new maintenance plans. I can take care of all the backup stuff, but the table maintenance baffles me somewhat. Does anyone have any input on how often I should run (or how often you run, which would suffice too) the following tasks?

    - Check Database Integrity
    - Rebuild Indexes
    - Reorganize Indexes
    - Update Statistics
    - Shrink Database?

    Am I missing anything? Again, if you can share how often you do these tasks that would be great... and/or share any general information about your approach to table maintenance, that would be helpful. Lastly, does it matter what order I run these tasks in (when setting up a job)?

    Read the article

  • SQL ConnectionString in web.config and global.asax implications

    - by rlb.usa
    This is going to sound very odd, but I have a web.config like this:

        <connectionStrings>
            <remove name="LocalSqlServer"/>
            <add name="LocalSqlServer" connectionString="Data Source=BACKUPDB;..." providerName="System.Data.SqlClient"/>
        </connectionStrings>

    And a global.asax like this:

        void Session_Start(object sender, EventArgs e)
        {
            // Code that runs when a new session is started
            if (Application["con"] == null || Application["con"] == "")
            {
                Application["con"] = "Data Source=PRODUCTIONDB;...";
            }
        }

    And EVERYWHERE in my code, I reference my connection string like this:

        SqlConnection con = new SqlConnection(Convert.ToString(HttpContext.Current.Application["con"]));

    However, I see that everything I do inside this application goes to BACKUPDB instead of PRODUCTIONDB. What is going on, how could this happen, and why? It doesn't make any sense to me, and it got me into a lot of trouble. We use the LocalSqlServer string for FormsAuthentication.

    Read the article

  • PostgreSQL 8.3 data types: xml vs varchar

    - by Sejanus
    There's an xml data type in Postgres. I've never used it before, so I'd like to hear opinions: downsides and upsides vs. using a regular varchar (or text) column to store XML. The text I'm going to store is XML, well-formed, UTF-8. There's no need to search by it (I've read that searching by xml is slow). This XML is actually data prepared for PDF generation with Apache FOP. The XML can be generated dynamically from data found elsewhere (other Postgres tables); it's stored as-is only so that I won't need to generate it twice - kind of a backup #2 for already generated PDF documents. Anything else to know? Good practices, performance, maintenance, etc.?

    Read the article

  • How to script indexes, keys, foreign keys in SQL Server

    - by dontomaso
    Hi, I would like to get the details of all indexes, keys, and foreign keys from a database in SQL Server (2008). How do I do this? I plan to use this to synchronize those properties across a couple of somewhat similar databases. I can use SQL Server Management Studio, but I cannot do a full backup of a database because of restrictions set by the web host.

    Secondary question that you do not need to answer: why can't there be something similar to MySQL's schema dump that simply lists the entire database structure in text SQL script format? Thanks,
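
    A hedged sketch using SQL Server Management Objects (SMO) from C# to script each table together with its indexes, keys and foreign key constraints, which covers what's asked for without needing a full backup. It requires the SMO assemblies that ship with SQL Server; the server and database names are placeholders, and the exact ScriptingOptions to enable may need adjusting.

        using System;
        using Microsoft.SqlServer.Management.Smo;

        class ScriptSchema
        {
            static void Main()
            {
                var server = new Server("localhost");
                Database db = server.Databases["MyDatabase"];

                var options = new ScriptingOptions
                {
                    Indexes = true,            // clustered and nonclustered indexes
                    DriAllKeys = true,         // primary keys, unique keys, foreign keys
                    DriAllConstraints = true,  // checks and defaults as well
                    SchemaQualify = true,
                    IncludeIfNotExists = true
                };

                foreach (Table table in db.Tables)
                {
                    if (table.IsSystemObject)
                        continue;

                    foreach (string line in table.Script(options))
                        Console.WriteLine(line);
                    Console.WriteLine("GO");
                }
            }
        }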

    Read the article

  • Best way to auto-restore db on an hourly basis

    - by aron
    Hello, I have a demo site where anyone can log in and test a management interface. Every hour I would like to flush all the data in the SQL 2008 database and restore it from the original. Red Gate SQL has some awesome tools for this, but they are beyond my budget right now. Could I simply make a backup copy of the database's data file, then have a C# console app that deletes it and copies over the original? Then I could have a Windows scheduled task run the .exe every hour. It's simple and free... would this work? I'm using SQL Server 2008 R2 Web Edition. I understand that Red Gate is technically better because I can set it to analyze the db and only update the records that were altered, while the approach I have above is more of a "sledgehammer".
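
    A minimal sketch of the "sledgehammer" as a console app that a scheduled task can run hourly. Rather than copying the raw .mdf file (which would require detaching the database first), this version connects to master and restores a pristine .bak over the demo database; the database name, backup path and connection string are placeholders.

        using System;
        using System.Data.SqlClient;

        class RestoreDemoDb
        {
            static void Main()
            {
                const string master = @"Data Source=.\SQL2008;Initial Catalog=master;Integrated Security=True";
                const string sql = @"
                    ALTER DATABASE DemoDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
                    RESTORE DATABASE DemoDb FROM DISK = N'C:\Backups\DemoDb_original.bak' WITH REPLACE;
                    ALTER DATABASE DemoDb SET MULTI_USER;";

                using (var con = new SqlConnection(master))
                using (var cmd = new SqlCommand(sql, con))
                {
                    cmd.CommandTimeout = 600;   // restores can take a while
                    con.Open();
                    cmd.ExecuteNonQuery();
                    Console.WriteLine("Restore finished at {0:u}", DateTime.UtcNow);
                }
            }
        }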

    Read the article

  • Converting from SQL Server 2000 to 2005 for ASP.NET web App

    - by Bazza Formez
    Hi there, I'm moving my ASP.NET website to a new provider. The only problem is, the old host supports my SQL Server 2000 db, while the new host only supports SQL Server 2005. How should I go about the conversion? Can I simply produce a backup (.bak) of the 2000 database at the old host, and restore that file into SQL Server 2005 at the new host? Or is there more to it? Note that I don't own a copy of SQL Server 2005 at home... and I'm trying to avoid having to do so. Thanks, Bazza

    Read the article

  • Replicating MySQL DB to development machine - bad idea?

    - by Joel
    I am considering replicating a production MySQL database to my development machine so I've always got current data. The production database is externally hosted. My development machine is behind an unreliable internet connection; it is entirely possible that the development machine could be disconnected from the internet for extended periods of time (hours). Would there be any adverse effect on the production database from doing this? (I don't strictly need live data - but it would be nice, and it's a good excuse to dabble with replication. If the consensus is that this is a bad idea, I'll set up a daily job to import the previous night's backup into my development database.)

    Read the article
