Search Results

Search found 13175 results on 527 pages for 'live backup'.

  • weird problem with load() or live()!!

    - by silversky
    I load a page with load() and then dynamically create a tag. Then I use live() to bind a click event that fires a function. At the end I call unload(). The problem is that when I load the same page again (without a refresh), on click the function fires twice. If I exit again (again with unload()) and load the page again, on click it fires 3 times, and so on... A sample of my code is:

        $('#tab').click(function() {
            $('#formWrap').load('newPage.php');
        });

        $('div').after('<p class="ctr"></p>');

        $('p.ctr').live('click', function(e) {
            if ($(e.target).is('[k=lf]')) {
                console.log('one');
                delete($this);
            } else if ....
        });

        function delete() {
            $.post('update.php', data);
        }

    I have other $.post calls on this page, including in the live() function above, and they all work well. The one above also works, but like I said, on the second load it fires twice, then 3 times, and so on... The weird part for me is that if I replace the console call with console.log('two'), save the page, and load the page without a refresh, it will fire on different rows - one, two. If I unload the page, replace the console call with console.log('three'), and load again, it will fire one, two and three. I tried to use:

        $.ajax({ url: 'updateDB.php', data: data, type: 'POST', cache: false });
        $.ajaxSetup({ cache: false });
        header("Cache-Control: no-cache");

    None of this is working, and I have this problem only with this function. What do you think the reason could be - does it remember the previous action and fire it again?
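
    A plausible explanation, offered as a hedged guess rather than a confirmed diagnosis: handlers bound with live() are attached at the document level and are not removed when load() swaps the page fragment out, so re-running the binding code on every load stacks one more handler each time - which matches the "once, then twice, then three times" pattern. A minimal sketch of the usual fix in the jQuery 1.3-1.8 API (where live()/die() exist); note also that `delete` is a reserved word in JavaScript, so a function literally named delete() would itself be a syntax error and is renamed here:

        var data = {};  // placeholder payload, stands in for the real one

        // Drop any previously attached live handler before rebinding, so
        // reloading the fragment can't stack duplicates. Alternatively,
        // run this binding once on DOM ready instead of on every load.
        $('p.ctr').die('click').live('click', function(e) {
            if ($(e.target).is('[k=lf]')) {
                console.log('one');
                removeItem();  // renamed from delete()
            }
        });

        function removeItem() {
            $.post('update.php', data);
        }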

  • How to use jQuery .live() with ajax

    - by kylemac
    Currently I am using John Resig's LiveQuery plugin/function - http://ejohn.org/blog/jquery-livesearch/ - to allow users to sort through a long unordered list of list items. The code is as follows:

        $('input#q').liveUpdate('ul#teams').focus();

    The issue arises when I use ajaxified tabs to sort the lists. Essentially I use ajax to pull in different lists, and the liveUpdate() function doesn't have access to the new li's. I assume I would need to bind this using the .live() function - http://api.jquery.com/live/ - but I am unclear how to bind this to an ajax event; I've only used the "click" event. How would I bind the new liveUpdate() to the newly loaded list items? EDIT: The ajax tabs run through the WordPress ajax API, so the code is fairly complex, but simplified it is something like this:

        $('div.item-list-tabs').click(function(event) {
            var target = $(event.target).parent();
            var data = { action: action, scope: scope, pagination: pagination };
            // Passes action to WP that loads my tab data
            $.post(ajaxurl, data, function(response) {
                $(target).fadeOut(100, function() {
                    $(this).html(response);
                    $(this).fadeIn(100);
                });
            });
            return false;
        });

    This is simplified for the sake of this conversation, but basically once the $.post loads the response in place, .liveUpdate() doesn't have access to it. I believe the .live() function is the answer to this problem; I'm just unclear on how to implement it with the $.post().
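
    One hedged suggestion (liveUpdate() below is the plugin function from the linked blog post, not a jQuery built-in): .live() only helps with event handlers, whereas liveUpdate() caches the list items at the moment it runs, so the more direct fix is usually to re-run the plugin after the new list has been inserted - that is, inside the $.post callback, once .html(response) has completed:

        $.post(ajaxurl, data, function(response) {
            $(target).fadeOut(100, function() {
                $(this).html(response);
                $(this).fadeIn(100);
                // Re-initialise the live search against the freshly
                // inserted markup so it caches the new <li> elements.
                $('input#q').liveUpdate('ul#teams');
            });
        });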

  • live.com setting can't be changed

    - by M M
    I'm on mail.live.com where, in the upper left corner, it says "Windows Live™" and to the right of that it says "Hotmail([number])", "Messenger", "SkyDrive", "|", "MSN". Directly under the "Windows Live™" there is a square, bluish/gray avatar (a generic, rotund person with a head, trunk and arms). To the right of that there is a field (with a subtle, barely perceptible speech-bubble-like arrow emanating from the avatar). But there's a word inside that field that I cannot get rid of. Coincidentally, I think it's the same word I used as a search term a while back, having meant to put it in the "Search email and more" Bing field on the other side of the screen. (Even that would have been a mistake, because I had been aiming for the e-mail search field.) But it remains in the field connected to the avatar, and moreover the field is editable only to the limited extent that a cursor can be placed into it with a mouse click; the word itself just can't be deleted. I don't know if the avatar should be there either, but I'd rather have simply that than the word next to it there continuously for time immemorial. If I click into the field hoping to delete the word, I'm confronted with options along the bottom of the same field (now expanded by my mouse click): "Add: Photo Link Document", a button that says "Share", and an [X] to reduce the field back to its default state - which still contains the word I'm trying to delete.

  • SQL Server transaction log backups

    - by krimerd
    Hi there, I have a question regarding transaction log backups in SQL Server 2008. I currently take full backups once a week (Sunday) and transaction log backups daily. I put the full backup in folder1 on Sunday, and then on Monday I also put the 1st transaction log backup in the same folder. On Tuesday, before I take the 2nd transaction log backup, I move the 1st one from folder1 into folder2, and then I take the 2nd transaction log backup and put it in folder1. Same thing on Wednesday, Thursday and so on. Basically, folder1 always holds the latest full backup and the latest transaction log backup, while the older transaction log backups are in folder2. My question is: when SQL Server is about to take, let's say, the 4th (Thursday) transaction log backup, does it look for the previous transaction log backups (1st, 2nd and 3rd) so that the new backup will only include the transactions since the last backup, or does it have some other way of knowing whether there are other transaction log backups? I'm asking because all my transaction log backups seem to be about the same size, and I thought their size would depend on the amount of transactions since the last transaction log backup. Can anyone please explain if my assumptions are right? Thanks...
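
    For context, this is standard SQL Server behaviour rather than anything specific to this setup: the engine never inspects the backup files themselves. How much of the log has already been backed up is tracked inside the database via log sequence numbers (LSNs), so moving older .trn files between folders has no effect on what the next log backup contains; each log backup simply covers everything since the previous one. A sketch of verifying the chain from the history SQL Server keeps in msdb (the database name and path are placeholders):

        -- Take the log backup.
        BACKUP LOG MyDatabase
        TO DISK = N'D:\Backups\folder1\MyDatabase_log.trn';

        -- Inspect the chain: each log backup's first_lsn should equal the
        -- previous log backup's last_lsn if the chain is unbroken.
        SELECT backup_start_date,
               type,        -- 'D' = full, 'L' = log
               first_lsn,
               last_lsn
        FROM   msdb.dbo.backupset
        WHERE  database_name = 'MyDatabase'
        ORDER BY backup_start_date;

    If the log backups are all roughly the same size, that usually just reflects a steady rate of logged activity between backups, not overlapping content.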

  • NTDS Replication Warning (Event ID 2089)

    - by Chris_K
    I have a simple little network with 3 AD servers in 2 sites. Site A has Win2k3 SP2 and Win2k SP4 servers, site B has a single Win2k3 SP2 server. All have been in place for at least 3 years now. Just last week I started getting Event 2089 "not backed up" warnings (example below) on both of the Win2k3 servers. I understand what the message means; no need to send me links to the TechNet article explaining it. I'll improve my backups. What I'm more curious about is why I just started getting this message now. Why haven't I been getting it for the past 3 years?!? Perhaps this is related: I recently decommissioned a few other sites and AD controllers (there used to be 3 more sites, each with their own controller). Don't worry, I did proper dcpromo exercises and made sure we didn't lose anything. But would shutting those down possibly be related to why I get this error now? This won't keep me awake at night but I am curious as to what changed...

        Event Type:     Warning
        Event Source:   NTDS Replication
        Event Category: Backup
        Event ID:       2089
        Date:           3/28/2010
        Time:           9:25:27 AM
        User:           NT AUTHORITY\ANONYMOUS LOGON
        Computer:       RedactedName
        Description:    This directory partition has not been backed up since at least the following number of days. Directory partition: DC=MyDomain,DC=com 'Backup latency interval' (days): 30 It is recommended that you take a backup as often as possible to recover from accidental loss of data. However if you haven't taken a backup since at least the 'backup latency interval' number of days, this message will be logged every day until a backup is taken. You can take a backup of any replica that holds this partition. By default the 'Backup latency interval' is set to half the 'Tombstone Lifetime Interval'. If you want to change the default 'Backup latency interval', you could do so by adding the following registry key. 'Backup latency interval' (days) registry key: System\CurrentControlSet\Services\NTDS\Parameters\Backup Latency Threshold (days) For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
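
    For reference, a sketch of setting the registry value the event text describes, should you want a different latency interval (the value name and path come straight from the event; the 60-day figure is only an example - keep it below your tombstone lifetime):

        reg add "HKLM\System\CurrentControlSet\Services\NTDS\Parameters" /v "Backup Latency Threshold (days)" /t REG_DWORD /d 60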

  • How do I restore a backup of my keyring (containing ssh key passprases, nautilus remote filesystem passwords and wifi passwords)?

    - by con-f-use
    I changed the disk in my laptop and installed Ubuntu on the new disk. The old disk had 12.04, upgraded to 12.10, on it. Now I want to copy over my old keyring with WiFi passwords, FTP passwords for Nautilus and ssh key passphrases. I have all the data from the old disk available (it is now a USB disk and I did not delete the old data yet or do anything with it - I could still put it in the laptop and boot from it like nothing happened). The old method of just copying ~/.gconf/... and ~/.gnome2/keyrings won't work. Did I miss something?

    1. Edit: I figure one needs to copy files not located in the user's home directory as well. I copied the whole old /home/confus (which is my home directory) to the fresh install, to no effect. That whole copy is now reverted to the fresh install's home directory, so my /home/confus is as it was after the fresh install.

    2. Edit: The folder /etc/NetworkManager/system-connections seems to be the place for WiFi passwords. Could be that /usr/share/keyrings is important as well for ssh keys - that's the only sensible thing that a search came up with:

        find /usr/ -name "*keyring*"

    3. Edit: Still no ssh and FTP passwords from the keyring. What I did:

        1. Converted the old hard drive to a USB drive
        2. Put the new drive in the laptop and installed a fresh version of 12.10 there
        3. Booted from the old HDD via USB and copied its /etc/NetworkManager/system-connections, ~/.gconf/, ~/.gnome2/keyrings and ~/.ssh over to the new disk
        4. Confirmed that all keys on the old install work
        5. Booted from the new disk

    Result: no passphrases for ssh keys, no FTP passwords in the keyring. At least the WiFi passwords were migrated.
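
    One hedged suggestion, based on the keyring location change in the newer GNOME stack that Ubuntu 12.10 ships: the keyring files moved from ~/.gnome2/keyrings to ~/.local/share/keyrings, so copying them to the old path is silently ignored. A minimal sketch, assuming the old disk is mounted at /media/olddisk and the user name is confus:

        # Keyrings (ssh passphrases, FTP/Nautilus passwords) are read from
        # ~/.local/share/keyrings on 12.10, not ~/.gnome2/keyrings.
        mkdir -p ~/.local/share/keyrings
        cp /media/olddisk/home/confus/.gnome2/keyrings/*.keyring ~/.local/share/keyrings/

        # WiFi passwords are system-wide, outside the user keyring:
        sudo cp /media/olddisk/etc/NetworkManager/system-connections/* \
                /etc/NetworkManager/system-connections/
        sudo chmod 600 /etc/NetworkManager/system-connections/*

        # Log out and back in so gnome-keyring re-reads the files.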

  • How to customize live Ubuntu CD with my own branding?

    - by Ahash
    I would like to create a customized Ubuntu OS (live CD/DVD) for my office and house by installing some additional packages. I have followed this link but it doesn't seem to work. Can anyone provide clear instructions? Packages that I want to install:

    - KDE desktop environment
    - Thunderbird
    - VLC player
    - Wine program loader
    - Skype
    - Plymouth Manager
    - Super Boot Manager
    - Synaptic Package Manager

    Changes that I need:

    - A different default Ubuntu wallpaper
    - The KDE environment installed
    - The boot (splash) screen replaced with my custom theme
    - The install set up with 2 languages, Bangla & English

    Please note, I do not prefer Remastersys; a manual way will be appreciated.
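
    For reference, a manual remaster follows a fairly standard pattern; below is a condensed sketch only (the ISO name, working directories and package list are placeholders, you may need to copy /etc/resolv.conf into the chroot for networking, and details such as regenerating filesystem.size vary between releases):

        # 1. Unpack the ISO and the compressed live filesystem.
        mkdir iso
        sudo mount -o loop ubuntu.iso /mnt
        cp -a /mnt/. iso/
        sudo unsquashfs -d squash iso/casper/filesystem.squashfs

        # 2. Chroot into the live system and install the extra packages.
        sudo mount --bind /dev squash/dev
        sudo chroot squash /bin/bash -c \
            "apt-get update && apt-get install -y kubuntu-desktop thunderbird vlc wine"
        sudo umount squash/dev

        # 3. Rebuild the squashfs and a bootable ISO.
        sudo mksquashfs squash iso/casper/filesystem.squashfs -noappend
        sudo genisoimage -r -V "Custom Ubuntu" -J -l \
            -b isolinux/isolinux.bin -c isolinux/boot.cat \
            -no-emul-boot -boot-load-size 4 -boot-info-table \
            -o custom.iso iso/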

  • 'unknown filesystem' grub rescue prompt; trying to wipe drive and boot 10.10 live

    - by Patrick
    I'm currently running Win7 and want to wipe the drive and install 10.10. I have 10.10 loaded on a USB thumb drive, and the machine sees the device in the BIOS, but it only reaches a screen saying:

        Unknown Filesystem
        grub rescue>

    I've read several results from Google, and a couple here, where people are trying to dual boot and, I assume, save the data on the drive, but I don't care about doing that and would prefer to just wipe the drive and start fresh. What steps can I take to get the drive to a point where I can load 10.10 live and get it installed?
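
    The grub rescue prompt suggests the machine is still booting the broken loader on the internal disk rather than the USB stick, so the first thing to check is the BIOS boot order (or the one-time boot menu, often F12). Once the live session does boot, a sketch of wiping the old boot sector and partition table before installing - destructive, and /dev/sda is an assumption, so confirm the device with `sudo fdisk -l` first:

        # Zero the MBR (boot code + partition table) of the internal disk,
        # then let the installer's "erase disk and install" handle the rest.
        sudo dd if=/dev/zero of=/dev/sda bs=512 count=1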

  • Amazon EC2 EBS automatic backup one-liner works manually but not from cron

    - by dan
    I am trying to implement an automatic backup system for my EBS volume on Amazon AWS. When I run this command as ec2-user:

        /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    everything works fine. But if I add this line to /etc/crontab and restart the crond service:

        15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    it doesn't work. I checked /var/log/cron and there is this line, so the command does get executed:

        Dec 13 12:15:01 ip-10-204-111-94 CROND[4201]: (ec2-user) CMD (/opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** )

    Can you please help me troubleshoot the problem? I guess it is some environment problem - maybe the lack of some variable. If that's the case I don't know what to do about it. Thanks.
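
    A common cause with the EC2 API tools, offered as a hedged guess since the exact paths depend on the AMI: cron runs with a minimal environment, so JAVA_HOME and EC2_HOME - which an interactive shell picks up from /etc/profile.d - are unset and the tool fails without output. A sketch of defining them at the top of /etc/crontab (the paths below are typical for the Amazon Linux AMI; verify with `echo $JAVA_HOME $EC2_HOME` in your shell):

        JAVA_HOME=/usr/lib/jvm/jre
        EC2_HOME=/opt/aws/apitools/ec2
        PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin

        15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********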

  • Using rsync to take backup of folder

    - by Ali
    Hi, I have a Linux server with a NAS mounted as the folder "mount". My website is in the "public_html" folder. I want to back up the website into the mount folder automatically at certain intervals, e.g. every hour. I read that there is something called "rsync" which is used to keep two folders in sync, and that it doesn't copy all the files every time; instead it checks whether a file has changed and only updates changed files. How do I use it to make automatic backups? I have root access to the server. Thanks
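
    A minimal sketch of exactly that (the source and destination paths are assumptions based on the description above): run rsync once by hand to test, then let cron repeat it hourly:

        # Mirror public_html into the NAS folder. -a preserves permissions
        # and times; --delete removes files that no longer exist in the
        # source, keeping the copy an exact mirror.
        rsync -a --delete /home/user/public_html/ /mount/backup/public_html/

        # Hourly schedule (add with `crontab -e` as root):
        0 * * * * rsync -a --delete /home/user/public_html/ /mount/backup/public_html/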

  • How to schedule daily backup in SQL Server 2008 Web Edition

    - by Xenon
    In SQL Server Management Studio I created a maintenance plan, but it won't work. The error is:

        Message
        Executed as user: LITESPELL-19C34\Administrator. Microsoft (R) SQL Server Execute Package Utility Version 10.0.1600.22 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. The SQL Server Execute Package Utility requires Integration Services to be installed by one of these editions of SQL Server 2008: Standard, Enterprise, Developer, or Evaluation. To install Integration Services, run SQL Server Setup and select Integration Services. The package execution failed. The step failed.

    But on the Microsoft page http://www.microsoft.com/sqlserver/2008/en/us/web.aspx, in the "Automate tasks and policies" section, it is written that backups can be scheduled in this edition - so how?
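
    One workaround consistent with the error message (hedged: the error only says that maintenance plans need Integration Services, which Web edition lacks - a plain T-SQL backup needs no SSIS, and as far as I can tell SQL Server Agent does ship with Web edition): schedule a BACKUP DATABASE statement directly, as an Agent job step or via sqlcmd from Windows Task Scheduler. A sketch with placeholder names:

        -- Run daily as a SQL Server Agent job step, or from Task Scheduler:
        --   sqlcmd -S localhost -E -i C:\Jobs\backup.sql
        BACKUP DATABASE MyDatabase
        TO DISK = N'D:\Backups\MyDatabase.bak'
        WITH INIT, NAME = N'MyDatabase daily full backup';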

  • Backup AWS Dynamodb to S3

    - by Ali
    It has been suggested in the Amazon docs, http://aws.amazon.com/dynamodb/ among other places, that you can back up your DynamoDB tables using Elastic MapReduce. I have a general understanding of how this could work, but I couldn't find any guides or tutorials on it. So my question is: how can I automate DynamoDB backups (using EMR)? So far, I think I need to create a "streaming" job with a map function that reads the data from DynamoDB and a reduce that writes it to S3, and I believe these could be written in Python (or Java or a few other languages). Any comments, clarifications, code samples or corrections are appreciated.
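
    For what it's worth, the pattern AWS documents for this uses Hive on EMR rather than a hand-written streaming job: a Hive table is mapped onto the DynamoDB table and exported to S3 with a query, which an EMR job flow can run on a schedule. A sketch (the table name, columns, bucket and column mapping are placeholders that must match your real schema):

        -- Hive table backed by the live DynamoDB table:
        CREATE EXTERNAL TABLE ddb_table (id string, payload string)
        STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
        TBLPROPERTIES (
            "dynamodb.table.name"     = "MyTable",
            "dynamodb.column.mapping" = "id:id,payload:payload"
        );

        -- Export everything to S3; scheduling this job automates the backup.
        INSERT OVERWRITE DIRECTORY 's3://my-bucket/dynamodb-backup/'
        SELECT * FROM ddb_table;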

  • Exchange 2010 and ESE Backup API

    - by Hannes de Jager
    Exchange 2010 does not support the ESE API for doing backups like it did in 2003 and 2007, according to MSDN. I quote: "Exchange 2010 no longer supports the ESE streaming APIs for backup and restore of program files or data. Instead, Exchange 2010 supports only VSS-based backups." So my question is: if this is the case, why is the DLL (ESEBCLI2.DLL) still shipped with Exchange 2010? I found it under C:\Program Files\Microsoft\Exchange Server\V14\Bin. Am I missing something here?

  • Using openssl encryption for Apple's HTTP Live Streaming

    - by Rob
    Has anyone had any luck getting encrypted streaming to work with Apple's HTTP Live Streaming using openssl? It seems I'm almost there, but my video doesn't play, and I don't get any errors in Safari either (like "Video is unplayable" or "You don't have permission to play this video", which I got when the key was wrong).

        # bash script:
        keyFile="key.txt"
        openssl rand 16 > $keyFile
        hexKey=$(cat key.txt | hexdump -e '"%x"')
        hexIV='0'
        openssl aes-128-cbc -e -in $fileName -out $encryptedFileName -p -nosalt -iv ${hexIV} -K ${hexKey}

        # my playlist file:
        #EXTM3U
        #EXT-X-TARGETDURATION:000020
        #EXT-X-MEDIA-SEQUENCE:0
        #EXT-X-KEY:METHOD=AES-128,URI="key.txt"
        #EXTINF:20, no desc
        test.ts.enc
        #EXT-X-ENDLIST

    I was using these docs as a guide: http://tools.ietf.org/html/draft-pantos-http-live-streaming
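
    One hedged observation about the script as posted: hexdump -e '"%x"' prints machine words without zero padding (and in host byte order), so the hex string handed to openssl -K will not reliably describe the same 16 bytes the player downloads from key.txt - any byte below 0x10 silently loses a digit. A byte-by-byte, zero-padded dump avoids that:

        # Either force one zero-padded byte at a time...
        hexKey=$(hexdump -v -e '16/1 "%02x"' "$keyFile")
        # ...or use xxd, which produces the same 32-character string:
        hexKey=$(xxd -p "$keyFile" | tr -d '\n')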

  • Scaled live iPhone Camera view in center, "CGAffineTransformTranslate" not working

    - by Gavin
    Hi, I have a little problem which I could not solve; I really hope someone can help me with it. I wanted to resize the live camera view and place it in the center, using the following code:

        picker.cameraViewTransform = CGAffineTransformScale(picker.cameraViewTransform, 0.5, 0.56206);
        picker.cameraViewTransform = CGAffineTransformTranslate(picker.cameraViewTransform, 80, 120);

    But all I got was a scaled, half-sized view in the top left of the screen. It seems as though CGAffineTransformTranslate does nothing at all. The translation didn't work even when I used:

        picker.cameraViewTransform = CGAffineTransformMake(1, 0, 0, 1, 80, 120);

    The translation portion seems to have no effect on the live camera view. Hope someone can enlighten me. Thanks.
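
    A hedged suggestion - this is general CGAffineTransform math, not anything specific to the camera view: CGAffineTransformTranslate composes the translation into the already-scaled coordinate system, so the offset can come out scaled or ordered differently than expected. Building the two pieces separately and concatenating them makes the order explicit (the first transform is applied first), with the translation expressed in plain screen points:

        // Scale the camera view, then move it by (80, 120) screen points.
        CGAffineTransform scale = CGAffineTransformMakeScale(0.5, 0.56206);
        CGAffineTransform translate = CGAffineTransformMakeTranslation(80.0, 120.0);
        picker.cameraViewTransform = CGAffineTransformConcat(scale, translate);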

  • How to set up a centralized backup server with lots of offsite workstations, intermittent internet connectivity, and stubborn users?

    - by Zac B
    This might be an impossible question. Context: We have a bunch of computers across around 1000 users. We have a centralized office where 900 of the users work most of the time. Most of the computers are laptops that are very frequently coming on and off the network for hours at a time. Users often take their computers home and do lots of work from home. In addition, there are a handful of users who work elsewhere in the country, who are offline (no internet connection whatsoever) for more than half of the time they use their machines. All of the machines are Windows 7/XP.

    Problem: People are always losing data. One day someone accidentally deletes a bunch of files. The next day someone else installs a bad driver or tries to mess with something in system32 and needs a personal data backup and a reinstall of Windows. Because so many of our business operations are done without an internet connection, and because computers come on- and offline so frequently, it's unfeasible to make users keep all of their data on network storage. We tried giving them Dropboxes, and they stored their files elsewhere. We bought and deployed Altiris, and they uninstalled it and blamed us when they couldn't get back files they accidentally deleted while they were offline and hadn't taken a backup in months. We tried teaching them backup best practices, and using scheduled sync tools to upload things to the network drives, and they turned them off because they "looked like viruses". It doesn't help that many of these users are pretty high up in the business and are not amenable to any sort of "you need to do something regularly because we say so" solution.

    Question: Other than finding another job where IT is treated differently and users are willing to follow best practices, how would people recommend I implement a file backup solution that supports the following:

    - Backs up to a centralized server over LAN or WAN whenever a network link becomes available, or on a schedule.
    - Supports interrupted/resumed backups (and hopefully file-delta-only backups), since connections to the network (WAN or LAN) are often slow and only open for half an hour or so.
    - Supports relatively rapid, "I accidentally deleted the TPS reports! Oh no!" single-file recovery, ideally administered from the central backup server rather than the client PC.
    - Supports local-to-local file-delta backup on a schedule, so that users without a network connection for a few days can still retrieve accidental deletions or whatnot. Ideally, the locally stored backups would be pushed up to the server whenever a network link is available.
    - Isn't configurable on the clients without certain credentials, because the CFOs (who won't give up their admin rights on the domain) will disable it if they can.
    - Backs up the entire hard drive. There are people who are self-righteous about storing things in C:\, or in the recycle bin, or in the C:\Windows dir (yes, I know).

    I'm fine integrating multiple products/solutions, or scripting different programs together myself (I'm a somewhat competent programmer), but I've been drawing a blank on where to start. Dropbox is folder-specific, Altiris doesn't cope with LAN outages or interrupted/resumed backups, and Volume Shadow Copy is awesome for a local-to-local solution, but I don't know how to push days of stored shadow copies up to a server in a 2-hour window of network access. The company is fine with spending decent money on this, thousands (USD) on a server and hundreds on clients, if necessary.

    I want to emphasize that this isn't a shopping list request. While I wish there was a program out there that did what I want, I've looked pretty hard and not found anything that fits the bill. Instead, I'm hoping for ideas on where to start hacking things together from scratch/from different technologies to make something stable that works. Cheers!
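
    As one possible building block for the "interrupted/resumed, push whenever a link appears" requirement (a sketch only: the server name, share and schedule are placeholders, and this covers file copy rather than bare-metal recovery): robocopy's restartable mode plus a scheduled task that retries every hour gets surprisingly far on stock Windows:

        :: Mirror the profile to the server. /Z resumes partially copied
        :: files, /MIR keeps a true mirror, /R:1 /W:5 gives up quickly when
        :: the server is unreachable so the next scheduled run just retries,
        :: and /XJ skips the junction points that loop inside profiles.
        robocopy "C:\Users\%USERNAME%" "\\backupsrv\backups\%COMPUTERNAME%" ^
            /MIR /Z /R:1 /W:5 /XJ /LOG+:"C:\backup.log"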

  • iPhone Live Video Stream Media Player

    - by happyhammer83
    I'm hoping to make an app that streams live video, with a view placed on top holding labels and a button. From my research and testing of the http video streaming feature (available since iPhone OS 3.0), it seems that you create a webview that points to the index html containing the converted video stream, and this displays as a QuickTime video in the app. This means that I don't have control over the media player that is opened. Does anyone know how you can control this? I know that Apple's MoviePlayer sample code shows you how to place views on top of a MediaPlayer video, but how can this be done with an http live stream? Thanks in advance.
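
    If the UIWebView route is the obstacle, one hedged alternative - assuming iOS 3.2 or later, where MPMoviePlayerController exposes its view - is to point the movie player at the .m3u8 playlist directly and layer your own controls on top (the URL is a placeholder; retain/release housekeeping is omitted):

        #import <MediaPlayer/MediaPlayer.h>

        // Play the HTTP live stream in a view we control.
        NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream/index.m3u8"];
        MPMoviePlayerController *player =
            [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
        player.view.frame = self.view.bounds;
        [self.view addSubview:player.view];
        [player play];

        // Anything added afterwards sits above the video.
        UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(20, 20, 200, 30)];
        label.text = @"Live";
        [self.view addSubview:label];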

  • WPF integrate Windows live authentication for windows health vault

    - by AnD
    Hi all, I'm just wondering if there's any way for a WPF application to integrate with Windows Live ID? It's actually for Microsoft HealthVault [www.healthvault.com]: HealthVault uses Windows Live ID or OpenID to log in to their system. What I want to do is create a WPF application (instead of a web application) for HealthVault, so that the login form, username, password and everything else is handled inside the WPF application without showing/using any internet browser. Since this is quite new for me, I hope there's somebody who has done this before, especially for the HealthVault system running in a standalone WPF app. Alright, so that's it, thank you in advance!

  • HTML Link and Jquery Live Only Works on First Try with one click

    - by Jon
    Hi everyone, I'm running into a problem with jQuery live event binding on a link. When the link is added to the page it works fine, but when another link is added to the unordered list, it takes two clicks for the click event to fire on either link. Any ideas?

        $("div#website-messages ul li a").live("click", function() {
            var link = $(this);
            changeTab(link.attr("href"));
            $(link.attr("title")).focus();
            return false;
        });
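
    It's hard to diagnose without seeing how the links are added, but one thing worth trying (a guess, not a confirmed fix): scope the delegation to the stable container with .delegate(), available since jQuery 1.4.2, so the handler lives on div#website-messages instead of at the document level:

        $("div#website-messages").delegate("ul li a", "click", function() {
            var link = $(this);
            changeTab(link.attr("href"));
            $(link.attr("title")).focus();
            return false;
        });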

  • IE 7 can't bind event (using .live()) to dynamically created element using .load()

    - by petron
    Hi All - I'm having trouble getting IE7 to keep a click event bound to an element that is added to the DOM using .load(). Here's some code:

        $('.mybtn').live('click', function(e){
            e.preventDefault();
            $('#mypage').load('load-this-page.htm');
        });

    And here's the html:

        <div id="mypage">
            <a href="#" class="mybtn">clickme</a>
            // stuff goes here
        </div>

    On page load the click works, but once the div is loaded via the clickme link, the click stops working in IE7. The clickme link is within the div on load and also within the load()-loaded html file; that's why I'm using live(). This code works in FF 3.6, fyi. Anyone have any idea what's up (besides the fact that IE sucks balls)? Thanks!

  • Make backup of large site with 100,000+ files/images

    - by niggles
    I tried backing up our site today using the Unix cp command and ended up getting our office blocked by Plesk - it added my IP to /etc/hosts.deny because it thought I was flooding the server. After tech support fixed the issue, they suggested I go folder by folder to back it up, but there are about 10,000 folders on the site totaling half a terabyte, each with multiple sub-folders, so this isn't viable. Basically I want to be able to mirror the domain on another domain we've got set up on the same dedicated server so I can test with live images (the bulk of our content). Any suggestions? E.g. adding some rules to open_basedir and getting PHP to recursively copy the folders to the other domain (remember it's on the same dedicated box, so it just needs to traverse the directory, not FTP things).
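
    Rather than a PHP copy loop, a hedged shell-level alternative (the paths are placeholders following Plesk's usual vhost layout): rsync run locally, throttled so it doesn't trip the flood detection again, and resumable if it gets interrupted:

        # Local mirror of one docroot into the test domain's docroot.
        # --bwlimit caps throughput in KB/s; nice/ionice keep CPU and disk
        # priority low so the live site stays responsive.
        nice -n 19 ionice -c3 rsync -a --bwlimit=5000 \
            /var/www/vhosts/example.com/httpdocs/ \
            /var/www/vhosts/test.example.com/httpdocs/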
