Search Results

Search found 3049 results on 122 pages for 'sync'.

Page 28/122 | < Previous Page | 24 25 26 27 28 29 30 31 32 33 34 35  | Next Page >

  • How do I synchronize two folders in Windows 7 in real time?

    - by acme
    I want Windows 7 to synchronize two folders in real time (maybe by running a service that monitors a folder). Basically I want to monitor a folder and synchronize each change (new files, changed files, deleted files) to another drive. It has to be in real time, so changes are synchronized instantly as they happen. A one-direction synchronisation is enough. I tried Microsoft's SyncToy, but it only syncs manually or on a schedule. Can this be achieved with Windows 7 itself, or does anyone know a freeware application for this?
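
    For reference, a minimal sketch of the "service that monitors a folder" idea, in C# with FileSystemWatcher doing a one-way mirror. The paths are placeholders, and error handling, retries and locked-file handling are deliberately omitted; this is an illustration of the approach, not a ready-made tool.

        // Minimal one-way, real-time mirror sketch using FileSystemWatcher.
        // Source/target paths are placeholders; no error handling or retry logic.
        using System;
        using System.IO;

        class FolderMirror
        {
            static string source = @"C:\Data\Source";      // folder to watch (placeholder)
            static string target = @"D:\Backup\Source";    // mirror destination (placeholder)

            static void Main()
            {
                Directory.CreateDirectory(target);
                using (var watcher = new FileSystemWatcher(source))
                {
                    watcher.IncludeSubdirectories = true;
                    watcher.Created += (s, e) => CopyFile(e.FullPath);
                    watcher.Changed += (s, e) => CopyFile(e.FullPath);
                    watcher.Deleted += (s, e) => DeleteFile(e.FullPath);
                    watcher.Renamed += (s, e) => { DeleteFile(e.OldFullPath); CopyFile(e.FullPath); };
                    watcher.EnableRaisingEvents = true;

                    Console.WriteLine("Mirroring... press Enter to stop.");
                    Console.ReadLine();
                }
            }

            // Map a path under 'source' to the corresponding path under 'target'.
            static string TargetPath(string sourcePath) =>
                Path.Combine(target, sourcePath.Substring(source.Length).TrimStart(Path.DirectorySeparatorChar));

            static void CopyFile(string sourcePath)
            {
                if (!File.Exists(sourcePath)) return;   // ignore events raised for directories
                var dest = TargetPath(sourcePath);
                Directory.CreateDirectory(Path.GetDirectoryName(dest));
                File.Copy(sourcePath, dest, overwrite: true);
            }

            static void DeleteFile(string sourcePath)
            {
                var dest = TargetPath(sourcePath);
                if (File.Exists(dest)) File.Delete(dest);
            }
        }

    In practice a change event can fire while the file is still locked by the writing application, so a real tool would queue the path and retry the copy shortly afterwards.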

    Read the article

  • Best way to sync a file between 2 or more drives?

    - by jasondavis
    I have a special file that I edit daily; it is somewhat like a large text file, but there is a little more to it than that. I have a copy on my main desktop and a copy of the file on a USB drive as well. I would like a way to open either file (from the USB drive or from my desktop drive), edit and save it, and have it stay updated on both drives. What is a lightweight and easy method of doing this? I do not need anything fancy.

    Read the article

  • Microsoft SyncFramework - Sync different tables into one

    - by evnu
    Hello, we are trying to get the Microsoft SyncFramework running in our application to synchronize an Oracle DB with a mobile device.

    Problem: The queries that we need to gather the data on the Oracle DB take a lot of time (and we haven't found a way to speed them up yet), so we try to split them up into as many portions as possible. One big part of the whole problem is that we need different pieces of information out of one big table, which bloats a query if combined. Unfortunately, the SyncFramework allows only one TableAdapter per SyncTable. This is a problem for our application: if we were able to use more than one TableAdapter per SyncTable, we could easily spread the queries in a more efficient way. Using one query per table which combines all the needed data takes way too much time.

    Ideas: I thought of creating different TableAdapters for each of the required queries and then merging the resulting datasets afterwards (preferably on the server). This seems to work, but is a rather awkward solution. Does someone of you know a better solution? Or do you have some ideas that could help? Thanks in advance, evnu

    EDIT: So, I implemented the merge solution. If you are interested, take a look at the following code. I'll give more details if there are questions.

        ' Namespaces this method relies on (assumed): System.IO, System.Collections.Generic, System.Data,
        ' System.Web.Services, System.Runtime.Serialization.Formatters.Binary, Microsoft.Synchronization.Data.
        <WebMethod()> _
        Public Function GetChanges(ByVal groupMetadata As SyncGroupMetadata, ByVal syncSession As SyncSession) As SyncContext
            Dim stream As MemoryStream
            Dim format As BinaryFormatter = New BinaryFormatter
            Dim anchors As Dictionary(Of String, Byte())

            ' keep track of the tables that will be updated
            Dim addTables As Dictionary(Of String, List(Of SyncTableMetadata)) = New Dictionary(Of String, List(Of SyncTableMetadata))

            ' list of all present anchors
            Dim allAnchors As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte())

            ' fill allAnchors - deserialize all given anchors
            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                ' OrElse (short-circuit) so the IsNull check is skipped when the anchor is Nothing
                If Table.LastReceivedAnchor Is Nothing OrElse Table.LastReceivedAnchor.IsNull Then Continue For
                stream = New MemoryStream(Table.LastReceivedAnchor.Anchor)
                anchors = format.Deserialize(stream)
                For Each item As KeyValuePair(Of String, Byte()) In anchors
                    allAnchors.Add(item.Key, item.Value)
                Next
                stream.Dispose()
            Next

            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                If allAnchors.ContainsKey(Table.TableName) Then
                    Table.LastReceivedAnchor.Anchor = allAnchors(Table.TableName)
                End If

                ' a sync parameter named after the table lists the real tables to query, separated by ":"
                Dim addSyncTables As List(Of SyncTableMetadata)
                If syncSession.SyncParameters.Contains(Table.TableName) Then
                    Dim tableNames() As String = syncSession.SyncParameters(Table.TableName).Value.ToString.Split(":")
                    addSyncTables = New List(Of SyncTableMetadata)
                    For Each tableName As String In tableNames
                        Dim newSynctable As SyncTableMetadata = New SyncTableMetadata
                        newSynctable.TableName = tableName
                        If allAnchors.ContainsKey(tableName) Then
                            Dim anker As SyncAnchor = New SyncAnchor(allAnchors(tableName))
                            newSynctable.LastReceivedAnchor = anker
                        Else
                            newSynctable.LastReceivedAnchor = Nothing
                        End If
                        newSynctable.SyncDirection = Table.SyncDirection
                        addSyncTables.Add(newSynctable)
                    Next
                    addTables.Add(Table.TableName, addSyncTables)
                End If
            Next

            ' add the newly created synctables
            For Each item As KeyValuePair(Of String, List(Of SyncTableMetadata)) In addTables
                For Each Table As SyncTableMetadata In item.Value
                    groupMetadata.TablesMetadata.Add(Table)
                Next
            Next

            ' fire queries
            Dim context As SyncContext = servSyncProvider.GetChanges(groupMetadata, syncSession)

            ' merge resulting datasets
            For Each item As KeyValuePair(Of String, List(Of SyncTableMetadata)) In addTables
                For Each Table As SyncTableMetadata In item.Value
                    If context.DataSet.Tables.Contains(Table.TableName) Then
                        If Not context.DataSet.Tables.Contains(item.Key) Then
                            Dim tmp As DataTable = context.DataSet.Tables(Table.TableName).Copy
                            tmp.TableName = item.Key
                            context.DataSet.Tables.Add(tmp)
                        Else
                            context.DataSet.Tables(item.Key).Merge(context.DataSet.Tables(Table.TableName))
                            context.DataSet.Tables.Remove(Table.TableName)
                        End If
                    End If
                Next
            Next

            ' create new anchors
            Dim allAnchorsDict As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte())
            For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata
                allAnchorsDict.Add(Table.TableName, context.NewAnchor.Anchor)
            Next
            stream = New MemoryStream
            format.Serialize(stream, allAnchorsDict)
            context.NewAnchor.Anchor = stream.ToArray
            stream.Dispose()

            Return context
        End Function

    Read the article

  • Glassfish v2 alternatedocroot - will DAS sync it?

    - by ring bearer
    Using Sun GlassFish Enterprise Server v2.1.1, I am using "alternatedocroot" via sun-web.xml for my web application to separate static content from the actual deployable code (EAR/WAR). What I have is a cluster of two server instances distributed across two physical hosts, HOST1 and HOST2. "alternatedocroot" points to /data/static-content/ on both HOST1 and HOST2. Would the DAS (Domain Administration Server) take care of syncing /data/static-content between HOST1 and HOST2 if I use the syncinstances=true option while starting up the cluster? Thanks!

    Read the article

  • Sync Vs. Async Sockets Performance in C#

    - by Michael Covelli
    Everything that I read about sockets in .NET says that the asynchronous pattern gives better performance (especially with the new SocketAsyncEventArgs, which saves on allocation). I think this makes sense if we're talking about a server with many client connections, where it's not possible to allocate one thread per connection. Then I can see the advantage of using the ThreadPool threads and getting async callbacks on them. But in my app, I'm the client and I just need to listen to one server sending market tick data over one TCP connection. Right now, I create a single thread, set the priority to Highest, and call Socket.Receive() with it. My thread blocks on this call and wakes up once new data arrives. If I were to switch this to an async pattern so that I get a callback when there's new data, I see two issues: (1) the threadpool threads will have default priority, so it seems they will be strictly worse than my own thread, which has Highest priority; and (2) I'll still have to send everything through a single thread at some point. Say that I get N callbacks at almost the same time on N different threadpool threads notifying me that there's new data. The N byte arrays that they deliver can't be processed on the threadpool threads, because there's no guarantee that they represent N unique market data messages, since TCP is stream based. I'll have to lock and put the bytes into an array anyway and signal some other thread that can process what's in the array. So I'm not sure what having N threadpool threads is buying me. Am I thinking about this wrong? Is there a reason to use the async pattern in my specific case of one client connected to one server?
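
    For context, a sketch of the blocking, dedicated-thread approach the question describes (one client, one TCP connection). The endpoint and the message handling are placeholders, not part of the question:

        // Single dedicated receive thread blocking on Socket.Receive().
        using System;
        using System.Net.Sockets;
        using System.Threading;

        class TickReceiver
        {
            static void Main()
            {
                var client = new TcpClient("feed.example.com", 5000);   // placeholder endpoint
                var socket = client.Client;

                var thread = new Thread(() =>
                {
                    var buffer = new byte[64 * 1024];
                    while (true)
                    {
                        // Blocks until data arrives; returns 0 when the server closes the connection.
                        int read = socket.Receive(buffer);
                        if (read == 0) break;
                        // TCP is a byte stream: 'read' bytes may hold part of one message or several,
                        // so frame/parse them here before handing them off to the processing thread.
                        // Process(buffer, read);
                    }
                });
                thread.Priority = ThreadPriority.Highest;   // as in the question
                thread.IsBackground = true;
                thread.Start();

                Console.ReadLine();
            }
        }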

    Read the article

  • Login From Multiple Services, Keeping Profiles in Sync

    - by viatropos
    Given the following: I have an application that allows people to log in through Twitter, MySpace, Yahoo, and Google. A user creates an initial account by logging in through Google, logs out, then logs back in using Yahoo. ...is there a recommended way for the application to associate those two accounts together? Stack Overflow has this functionality, but it seems like they need the user to manually say "this Google account is associated with that Yahoo one". Is there no way to do this automatically?

    Read the article

  • Unable to sync custom authentication with RIA services in SL3 + RIA implementation

    - by Nair
    I am developing SL3 + RIA Services with custom authentication. I followed the example in http://code.msdn.microsoft.com/RiaServices/Release/ProjectReleases.aspx?ReleaseId=2661 to implement custom authentication. Based on that implementation, you first make a login request from the client to the service. This request is an async process. Now the client GUI will start to bind data to SL controls using RIA Services, which requires the authentication to be successful (enforced by adding the [RequireAuthentication] attribute). The trouble is, since the login was requested from the main process, while it is doing authentication the page control takes over and starts to bind data using RIA Services. But authentication is not completed yet, so whichever service method the data binding hits first will fail with 'Access denied'. Bottom line: the GUI will not wait for authentication to complete before starting the data binding. My question is, how do you handle this situation? Thanks,
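
    One way to sequence this, sketched below purely as an assumption about the asker's code, is to kick off the authenticated loads only from the login completion callback. The namespace and types are those of the released WCF RIA Services authentication API (the SL3 preview bits linked above may differ), and LoadCustomers() is a placeholder for whatever loads the page normally starts:

        // Sketch: defer data binding until the async login has completed.
        using System.ServiceModel.DomainServices.Client.ApplicationServices;

        public partial class MainPage
        {
            private void Login(string user, string password)
            {
                WebContext.Current.Authentication.Login(
                    new LoginParameters(user, password),
                    loginOperation =>
                    {
                        if (loginOperation.HasError || !loginOperation.LoginSuccess)
                        {
                            // show an error instead of starting any loads
                            return;
                        }
                        LoadCustomers();   // placeholder: start the authentication-protected loads here
                    },
                    null);
            }

            private void LoadCustomers() { /* domain context Load(...) calls go here */ }
        }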

    Read the article

  • Able to get the events in fullcalendar, but not able to sync these events with time

    - by Ubaid
    I just loved fullcalendar and wanted to implement it in a small application; everything worked OK. I am able to get the events from my database through JSON to the front end, but all events are being listed as "ALL-DAY" events. I am not able to figure out why (here is the screenshot for the same). Any ideas what is going wrong? I am using ASP.NET and C#. I have already tried sending the start and end dates in ToString(), ToShortDateString(), ToString("s"), ToLongDateString() and ToUniversalTime() formats. Nothing seems to be working for me at the moment. I tried hard-coding and sending the data too. Sample JSON of my data:

        [{ "id": "2", "title": "Event2", "start": "1274171700", "end": "1274175600" },
         { "id": "1", "title": "Event1", "start": "5/18/2010 16:30:00", "end": "5/18/2010 19:30:00" },
         { "id": "3", "title": "Event3", "start": "5/18/2010 2:05:00 PM", "end": "5/18/2010 3:10:00 PM" },
         { "id": "4", "title": "Event4", "start": "5/18/2010", "end": "5/18/2010" },
         { "id": "5", "title": "Event5", "start": "2010-05-18T14:05:00", "end": "2010-05-18T15:10:00" }]

    All the data above has different date formats, and at the moment nothing seems to be working. fullcalendar accepts the day part fine, but not the time part. Not sure why. Can anybody help?
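
    One thing worth ruling out (an assumption here, not something confirmed by the question): when the calendar cannot read a time from an event, or no allDay flag is supplied, it can fall back to all-day rendering. A sketch of building the feed on the C# side with a single consistent ISO 8601 format and an explicit allDay flag; the CalendarEvent class and sample values are illustrative:

        using System;
        using System.Collections.Generic;
        using System.Web.Script.Serialization;

        // Shape of one event as the calendar's JSON feed expects it; property names are lowercase on purpose.
        public class CalendarEvent
        {
            public string id { get; set; }
            public string title { get; set; }
            public string start { get; set; }   // ISO 8601, e.g. "2010-05-18T14:05:00"
            public string end { get; set; }
            public bool allDay { get; set; }    // explicit, so the calendar never has to guess
        }

        public static class EventFeed
        {
            // Builds the JSON feed with one consistent date format and allDay set to false.
            public static string Build()
            {
                var events = new List<CalendarEvent>
                {
                    new CalendarEvent
                    {
                        id = "1",
                        title = "Event1",
                        start = new DateTime(2010, 5, 18, 16, 30, 0).ToString("s"),  // 2010-05-18T16:30:00
                        end   = new DateTime(2010, 5, 18, 19, 30, 0).ToString("s"),
                        allDay = false
                    }
                };
                return new JavaScriptSerializer().Serialize(events);
            }
        }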

    Read the article

  • Synchronizing two SQL Server databases using MS Sync Framework

    - by Immortal
    I have one central SQL Server database which can be offline from time to time. I have a desktop application using Local DB Cache (SQL CE) to synchronize with the central database, and I also have a web application with its own SQL Server that I'd also like to keep synchronized. All synchronizations must be bidirectional. Is there a way to synchronize my central database with the web application's database in the same way as I synchronize my central database with the desktop client? I know about collaboration scenarios and peer-to-peer synchronization, but I would like to avoid manual provisioning of databases. I'd like to use the integrated SQL Server 2008 change tracking, just like in the SQL CE <-- SQL Server scenario.

    Read the article

  • Git repository is out of sync after rebase

    - by Keyo
    I have squashed 2 commits (A and B) into one new commit (C). The previous two commits (A and B) were removed. I pushed these commits from my development repo to a central (bare) repository. The git log on both repos confirms that commits A and B have been removed. The problem is, when I do a pull on a third repository which already had A and B, it now has all three commits (A, B and C). I would have thought the pull would synchronise these changes. Do I need to check out A~1 and then merge in the new changes? This seems like a hassle, especially in a production environment.

    Read the article

  • TFS and Project portal sync

    - by Tigran
    Just installed TFS 2008 and created a project with a portal site. When I create a bug in TFS from Visual Studio, I'd like to see that bug in the web portal, and the other way around: when I add a task in the web portal, I'd like to see it in Visual Studio. It's as if the web portal is not connected to TFS. I'm using the MSF for Agile Software Development template. I just saw Team System Web Access; that's what I want in addition to the other pages the SharePoint portal provides, for example the Wiki page and calendar view. So, one place to see all task lists, assign tasks and see other project-related pages, and maybe be able to have the tasks' due dates come up on the calendar. Many thanks

    Read the article

  • Sync between MS Exchange calendar and Java app

    - by arkadiy
    Hi, we are developing a CRM app which holds customer meeting info. Users have requested that their Outlook calendars should reflect the activity they have booked in the CRM application, and vice versa. Is there any solution to achieve this, preferably without using any plugins or installs on the end user's PC?

    Read the article

  • How does Ubuntu One sync two machines with identical file content?

    - by user27449
    I have a notebook and a desktop computer, both running Ubuntu 11.10. I used to sync between the two with the help of Unison, so both computers have identical content in the Documents folder. I decided to try Ubuntu One. My question is: if I activate Ubuntu One on the two machines for the folders with identical contents, will Ubuntu One be able to recognise that, or will it sync everything to the cloud twice (and then down to the other machine)? To put it another way, will I end up with two copies of everything on the machines and in the cloud, and should I therefore delete the identical files on one of the machines before activating Ubuntu One, or not? Thank you, and if there is already something on the net about this, I'd be glad if somebody posted the link here.

    Read the article

  • Combining streams on FMS: sync and unify

    - by yn2
    Hi folks, I was wondering if this can be done easily (or at least "can be done"). I have several live streams from different users, all being served by an FMS server for online talk. We are recording every incoming stream. What we want is to join them in some way so we end up with a single, synced file combined from the several incoming streams (or recorded FLV files). Any ideas?

    Read the article

  • Thinking Sphinx - sorting by a string attribute gets out of sync when changes are made

    - by Scott Brown
    I have a "restaurants" table with a "name" column. I've defined the following index: indexes "REPLACE(UPPER(restaurants.name), 'THE ', '')", :as => :restaurant_name, :sortable => true ... because I want to sort the restaurant names without respect to the prefix "The ". My problem is that whenever one of these records is updated (in any way) the new record jumps to the top of the sort order. If another record is updated, it also jumps ahead of the rest. I end up with two lists: a list of restaurants that have been updated since the last re-indexing and a list of those that haven't. Each respective list is in alphabetical order, but I don't understand why the overall list is getting segregated this way. I do have a delayed delta index set up, and I assume the issue is related to this.

    Read the article

  • Table-level diff and sync procedure for T-SQL

    - by Ville Koskinen
    I'm interested in T-SQL source code for synchronizing a table (or perhaps a subset of it) with data from another similar table. The two tables could contain any variables; for example I could have

        base table          source table
        ==========          ============
        id   val            id   val
        ----------          ------------
        0    1              0    3
        1    2              1    2
        2    3              3    4

    or

        base table            source table
        ===================   ==================
        key  val1  val2       key  val1  val2
        -------------------   ------------------
        A    1     0          A    1     1
        B    2     1          C    2     2
        C    3     3          E    4     0

    or any two tables containing similar columns with similar names. I'd like to be able to:

    - check that the two tables have matching columns: the source table has exactly the same columns as the base table and the datatypes match
    - make a diff from the base table to the source table
    - do the necessary updates, deletes and inserts to change the data in the base table to correspond to the source table
    - optionally limit the diff to a subset of the base table

    preferably with a stored procedure. Has anyone written a stored proc for this, or could you point me to a source?
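
    For the update/delete/insert step specifically, SQL Server 2008's MERGE statement covers a full table sync in one statement; below is a hedged sketch driven from C#, using the column names of the first example above. The connection string is a placeholder, and limiting the diff to a subset would mean adding predicates to the USING source and to the NOT MATCHED BY SOURCE clause, which is not shown here:

        // Sketch: apply base-table changes from the source table with one T-SQL MERGE.
        using System.Data.SqlClient;

        class TableSync
        {
            const string MergeSql = @"
                MERGE INTO dbo.base_table AS b
                USING dbo.source_table AS s
                    ON b.id = s.id
                WHEN MATCHED AND b.val <> s.val THEN
                    UPDATE SET b.val = s.val
                WHEN NOT MATCHED BY TARGET THEN
                    INSERT (id, val) VALUES (s.id, s.val)
                WHEN NOT MATCHED BY SOURCE THEN
                    DELETE;";

            static void Main()
            {
                using (var conn = new SqlConnection("Server=.;Database=Test;Integrated Security=true"))
                using (var cmd = new SqlCommand(MergeSql, conn))
                {
                    conn.Open();
                    int affected = cmd.ExecuteNonQuery();   // rows inserted + updated + deleted
                    System.Console.WriteLine("{0} rows changed", affected);
                }
            }
        }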

    Read the article

  • Sync Framework: SqlSyncProvider ItemConflicting vs ApplyChangeFailed

    - by Paul Smith
    I'm trying to design a synchronisation application that syncs changes between different SQL Server databases. I came up with a design based around receiving the ItemConflicting event, storing the knowledge associated with the conflict, and resolving all conflicts off-line. However, it seems that I can only get the ApplyChangeFailed event to fire. Is there some reason why SqlSyncProvider does not use the ItemConflicting event? Am I just hooking up to the event wrongly? The reason I care is that the ItemConflicting event allows me to simply log the conflict and continue with the rest of the synchronisation, in a way that I can't seem to achieve with the ApplyChangeFailed event.
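
    For reference, a sketch of the ApplyChangeFailed route, logging the conflict and letting the rest of the session continue, using the Sync Framework database provider types; LogConflict(...) and the connection strings are placeholders, not part of the question:

        // Sketch: log conflicts from ApplyChangeFailed and keep the sync session going.
        using Microsoft.Synchronization;
        using Microsoft.Synchronization.Data;
        using Microsoft.Synchronization.Data.SqlServer;

        class ConflictLoggingSync
        {
            public void Synchronize(string localConnection, string remoteConnection, string scopeName)
            {
                var local = new SqlSyncProvider(scopeName, new System.Data.SqlClient.SqlConnection(localConnection));
                var remote = new SqlSyncProvider(scopeName, new System.Data.SqlClient.SqlConnection(remoteConnection));

                local.ApplyChangeFailed += OnApplyChangeFailed;
                remote.ApplyChangeFailed += OnApplyChangeFailed;

                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider = local,
                    RemoteProvider = remote,
                    Direction = SyncDirectionOrder.UploadAndDownload
                };
                orchestrator.Synchronize();
            }

            private void OnApplyChangeFailed(object sender, DbApplyChangeFailedEventArgs e)
            {
                // Record enough detail to resolve the conflict off-line later.
                LogConflict(e.Conflict.Type, e.Conflict.LocalChange, e.Conflict.RemoteChange);

                // Skip this row for now and let the rest of the changes apply.
                e.Action = ApplyAction.Continue;
            }

            private void LogConflict(DbConflictType type, System.Data.DataTable local, System.Data.DataTable remote)
            {
                /* persist to a conflict table (placeholder) */
            }
        }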

    Read the article

  • Specifying path in a preference file in Unison

    - by Curious2learn
    I am using Unison to sync a folder on my local computer with one on a server, using a Unison preference file. If I specify the path of the local folder using Method 1 (see below), things work well, but Method 2 does not. I would like to use something like Method 2 because I want to use the same preference file (synced using Dropbox) on two different computers, but the usernames are different on these two computers and I cannot change them. Any ideas how this can be achieved? Thank you.

        METHOD 1 (works)
            root = /Users/username1/Dropbox/path_to_folder/folder

        METHOD 2a (does not work)
            root = ~/Dropbox/path_to_folder/folder

        METHOD 2b (does not work)
            root = $HOME/Dropbox/path_to_folder/folder

    Read the article

  • Free tools/libraries to compare tables with filtering in different databases and visualize/sync diff

    - by MicMit
    I am building a GUI in C# for a content manager and am looking for tools, code snippets or open libraries (code ideally in C#) which allow me the following:

    1. For table A in database X (test) and table A in database Y (production), and for a simple filter (e.g. listname = "XYZ"), I need to show additions/deletions/updates in some way, which might be side-by-side or just an HTML report (e.g. "2 records added" / "2 records deleted", each as an HTML table with some fields). Considering that this task is very common, I guess certain components should exist? Components could either return collections from the given parameters for further visualizing, or just produce the reports mentioned above (see the sketch after this list).

    2. I need to push the changes for the filter mentioned in 1 and update the table in the production database for this filter only (i.e. for the particular list approved by the content person). Again, there are probably certain SQL code generators - components in addition to diffs, or standalone.

    3. The key thing: the tools/libraries should be suitable for integration with the existing C# application.
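
    A minimal sketch of the diff step in plain ADO.NET, computing added/deleted/updated rows between the test and production copies of a table for one filter value. The connection strings, the key column name ("id") and the filter column are placeholder assumptions; the resulting lists could feed a side-by-side grid or an HTML report:

        using System;
        using System.Collections.Generic;
        using System.Data;
        using System.Data.SqlClient;

        class TableDiff
        {
            // Load the filtered slice of TableA from one database (table/column names are placeholders).
            static DataTable Load(string connectionString, string filterValue)
            {
                const string sql = "SELECT * FROM dbo.TableA WHERE listname = @listname";
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    cmd.Parameters.AddWithValue("@listname", filterValue);
                    var table = new DataTable();
                    new SqlDataAdapter(cmd).Fill(table);
                    return table;
                }
            }

            static void Main()
            {
                DataTable test = Load("Server=.;Database=X;Integrated Security=true", "XYZ");
                DataTable prod = Load("Server=.;Database=Y;Integrated Security=true", "XYZ");

                // Index production rows by key; assumes both copies share the same columns.
                var prodByKey = new Dictionary<string, DataRow>();
                foreach (DataRow row in prod.Rows)
                    prodByKey[row["id"].ToString()] = row;      // "id" = key column (placeholder)

                var added = new List<DataRow>();
                var updated = new List<DataRow>();
                foreach (DataRow row in test.Rows)
                {
                    DataRow match;
                    if (!prodByKey.TryGetValue(row["id"].ToString(), out match))
                        added.Add(row);                          // in test, not yet in production
                    else
                    {
                        prodByKey.Remove(row["id"].ToString());
                        foreach (DataColumn col in test.Columns)
                            if (!Equals(row[col.ColumnName], match[col.ColumnName]))
                            {
                                updated.Add(row);                // same key, different values
                                break;
                            }
                    }
                }
                var deleted = new List<DataRow>(prodByKey.Values);   // left over = only in production

                Console.WriteLine("{0} added, {1} updated, {2} deleted", added.Count, updated.Count, deleted.Count);
            }
        }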

    Read the article

  • Data sync from Rails with iPhone SQLite db

    - by Markus
    Hi all, I'm wondering whether there is a possibility to export some selected data from my Rails MySQL db to another SQLite db. The aim is to send that SQLite file directly to my iPhone application... That way I don't have to do a lot of XML integration in the iPhone app, which seems to be very slow... Markus

    Read the article

  • AIR: sync GUI with database?

    - by John Isaacks
    I am going to be building an AIR application that shows a list (about 1-25 rows of data) from a database. The database is on the web. I want the list to be as accurate as possible, meaning as soon as the database data changes, the list displayed in the app should update ASAP. I do not know of any way that the AIR application could be notified when there is a change, so I am thinking I am going to have to poll the database at certain intervals to keep an up-to-date list. So my question is: first, is there any way to NOT have to keep checking the database? Or, if I do have to keep checking the database, what is a reasonable interval to do that at? Thanks.

    Read the article

< Previous Page | 24 25 26 27 28 29 30 31 32 33 34 35  | Next Page >