Search Results

Search found 65464 results on 2619 pages for 'web based backup'.

  • Where's the best place to find good senior web developers?

    - by bokani
    We are looking for a senior web developer for a business start-up based in London (Mayfair).
        • Demonstrable experience developing Web 2.0 projects
        • Complete fluency in HTML, JavaScript, CSS, PHP and MySQL
        • Experience of jQuery, AJAX and PHP interaction
        • Ability to develop applications making use of APIs (Google Maps, Facebook, bespoke CRMs and similar)
        • Good design aesthetic, including familiarity with Photoshop and CSS
        • Substantial experience hand-coding
        • Familiarity with server administration, including cPanel
        • Ability to design HTML newsletters
        • Progressive enhancement
        • AJAX application state memory
    Salary: £30,000 to £40,000

    Read the article

  • How can I check gzip decoding time in the web browser?

    - by user293421
    I want to measure gzip decoding speed in a web browser. In Java or C# it is easy to time gzip decoding, but I cannot measure the decoding time inside a web browser. I want to check the decoding speed of gzipped HTML files. Can I measure this performance with JavaScript?

    Read the article

  • MySQL backup and restore

    - by Codezy
    I am having trouble getting MySQL backups to run properly when there are views in the database. I think this might have something to do with needing a placeholder object for them. In any event, I run this command:
        mysqldump -u myuser -pmypassword mydatabase | mysql -u myuser -pmypassword -C mydatabase_Beta
    The user has full privileges, and I get this:
        View mydatabase_beta.yadayada references invalid tables or columns or functions or definer/invoker or view lack rights to use them.
    How can I back it up so that it restores all of my database properly? In the example I am restoring it to a different name, but I do need to be able to restore a working copy. I think it is probably an additional mysqldump parameter, or maybe a hot copy would work better. Thoughts?
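    A rough dump-and-restore sketch that usually copes better with views (standard mysqldump flags; the database names are the ones from the question, and the usual suspects behind this error are the views' DEFINER and views that still reference the original database by name):

        # dump schema, data, views, routines and triggers to a file first
        mysqldump -u myuser -pmypassword --single-transaction --routines --triggers mydatabase > mydatabase.sql
        # inspect/adjust the DEFINER clauses if needed, then load the dump into the copy
        mysql -u myuser -pmypassword mydatabase_Beta < mydatabase.sql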

    Read the article

  • Using NTBackup to Backup Exchange 2003 Mail Stores

    - by Kyle Brandt
    I have NetBackup backing up my Exchange stores to tape, but I would like to make the restore process faster. I have plenty of room on the array attached to the mail server, so I was thinking I could use NTBackup to do weekly backups in addition to my tape backups. Has anyone used this with good success?
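    NTBackup can be scripted for exactly this kind of extra weekly job; a rough sketch, assuming a .bks selection file built in the NTBackup GUI that includes the Information Store (all paths and names below are placeholders):

        rem weekly disk-based backup of the Exchange stores listed in the selection file
        ntbackup backup "@C:\Backups\exchange-stores.bks" /d "Weekly Exchange store backup" /f "E:\Backups\exchange-weekly.bkf" /m normal /v:yes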

    Read the article

  • SQL Server Restore from Backup, Just the Primary Filegroup

    - by bladefist
    Thankfully, this question is just a what-if, and I am not in an emergency right now. I have created a filegroup in my database (SQL Server 2008) and moved some massive data tables over to it, leaving my website's central tables in the primary filegroup. In the event of a restore, can I restore just the primary filegroup and have a working database, or do I have to restore both filegroups? I don't want my site down for ages while it restores the second filegroup.
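    Restoring only the primary filegroup is possible as a piecemeal (partial) restore; tables in the filegroup you skip simply stay offline until you restore it later. A minimal sketch, assuming the full recovery model and hypothetical backup paths (under the simple recovery model all read-write filegroups have to be restored in the first step):

        rem initial partial restore of just the primary filegroup
        sqlcmd -S myserver -Q "RESTORE DATABASE MyDb FILEGROUP = 'PRIMARY' FROM DISK = 'D:\Backups\MyDb_full.bak' WITH PARTIAL, NORECOVERY"
        rem roll forward and bring the restored part online
        sqlcmd -S myserver -Q "RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log.trn' WITH RECOVERY"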

    Read the article

  • Cannot Access Web Interface on HP 2510G

    - by Stephen
    I am currently setting up a new infrastructure with HP 2510s as edge switches and an HP E5406 as the main switch. I also have a DHCP and DNS server running on the same network. When I first set up one of my 2510 switches, I gave it a static IP through the console and then went to the web interface to continue my configuration. Later, I realized that I had assigned it the wrong IP address, so I went through the web interface and changed the IP address to the correct one. Now I can't access the web interface. I can telnet to the switch on the new IP address, but the web interface will not load. If I switch from static IP to DHCP, it loads the web interface. Any ideas on what could be causing the web server in the 2510 not to load with the new static IP address?
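    Since telnet still works, it is worth confirming from the CLI that the web UI is actually enabled and that the new address is what the switch thinks it is using. A sketch from memory of the ProCurve CLI; command names may differ slightly between firmware releases:

        configure
        web-management plaintext      # re-enable the HTTP interface ("web-management ssl" for HTTPS)
        write memory                  # save the change
        show ip                       # confirm the address, mask and default gateway in use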

    Read the article

  • Backup the Windows user folder in the cloud?

    - by Benjamin
    As I understand it, Google Drive and Dropbox, the two cloud storage providers I happen to know, can only sync a predefined folder that is created upon installation. I'd be happy to have an automated synchronisation of my folders in the cloud, but I'm not ready to change my habits, and start saving all my documents in the folder imposed by the provider. Is it possible with one of these, or any other you might know, to sync the full Windows user folder instead?
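    One common workaround is to leave the provider's sync folder where it is and place a junction inside it that points at the folder you actually use. A sketch for Dropbox, with a placeholder username (whether a given client follows junctions varies by product and version, so test it with a small folder first):

        rem run from an elevated command prompt
        mklink /J "C:\Users\Benjamin\Dropbox\Documents" "C:\Users\Benjamin\Documents"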

    Read the article

  • Migrating puppet clients to a new puppet master (old puppet master server gone, only a backup available)

    - by user47650
    My puppet master server had a hardware failure, and I have restored it to another box. However, this box has different hardware and a different hostname. If I restore the existing /etc/puppet directory to the new server, the puppetmaster will not start, failing with the following error:
        # puppetmasterd --debug --verbose
        Could not prepare for execution: Retrieved certificate does not match private key; please remove certificate from server and regenerate it with the current key
    So what steps do I need to take to allow the new puppetmaster to start and to generate a new puppetmaster certificate using the old CA? Also, will the puppet clients actually report in to a different puppet server using a server certificate that has been generated with the old CA?
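    Assuming the restored ssldir (including its ca directory) is intact, one approach is to keep the old CA but discard the old master's own host certificate and have that CA issue one for the new hostname; clients should keep trusting the new master because its certificate still chains to the same CA, provided they connect using a name the new certificate covers. A rough sketch with placeholder hostnames and the common default ssldir (check puppetmasterd --configprint ssldir):

        # remove the old master's host cert and private key, but leave ssl/ca untouched
        rm /var/lib/puppet/ssl/certs/oldmaster.example.com.pem
        rm /var/lib/puppet/ssl/private_keys/oldmaster.example.com.pem
        # have the restored CA issue a certificate for the new hostname
        puppet cert --generate newmaster.example.com    # "puppetca --generate" on older releases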

    Read the article

  • Squid - Active Directory - permissions based on Nodes rather than Groups

    - by Genboy
    I have Squid running on a gateway machine and I am trying to integrate it with Active Directory, both for authentication and for giving different users different browsing permissions.
        1) /usr/lib/squid/ldap_auth -b OU=my,DC=company,DC=com -h ldapserver -f sAMAccountName=%s -D "CN=myadmin,OU=Unrestricted Users,OU=my,DC=company,DC=com" -w mypwd
        2) /usr/lib/squid/squid_ldap_group -b "OU=my,DC=company,DC=com" -f "(&(sAMAccountName=%u)(memberOf=cn=%g,cn=users,dc=company,dc=com))" -h ldapserver -D "CN=myadmin,OU=Unrestricted Users,OU=my,DC=company,DC=com" -w zxcv
    Using the first command above, I am able to authenticate users. Using the second command, I am able to figure out whether a user belongs to a particular Active Directory group, so I should be able to set ACLs based on groups. However, my customer's AD setup has users arranged under different nodes. For example, he has users set up in the following way:
        cn=usr1,ou=Lev1,ou=Users,ou=my,dc=company,dc=com
        cn=usr2,ou=Lev2,ou=Users,ou=my,dc=company,dc=com
        cn=usr3,ou=Lev3,ou=Users,ou=my,dc=company,dc=com
    He wants different permissions depending on whether a user sits under the Lev1, Lev2 or Lev3 node. Note that these aren't groups but nodes (OUs). Is there a way to do this with Squid? My Squid is running on a Debian machine.
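    One direction to explore, sketched and untested: run one external ACL per OU, each calling squid_ldap_group with its search base (-b) pinned to that OU and a filter that only needs the username, so the trailing argument on the acl line is just a dummy group token. This assumes Squid passes that argument through to the helper unchanged:

        external_acl_type lev1_check %LOGIN /usr/lib/squid/squid_ldap_group -b "ou=Lev1,ou=Users,ou=my,dc=company,dc=com" -f "(sAMAccountName=%u)" -h ldapserver -D "CN=myadmin,OU=Unrestricted Users,OU=my,DC=company,DC=com" -w mypwd
        acl lev1_users external lev1_check dummy
        http_access allow lev1_users
        # repeat the external_acl_type/acl pair for Lev2 and Lev3 with their own base DNs,
        # then give each acl its own http_access rules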

    Read the article

  • How to install Web Deployment Agent

    - by Jerry
    I am trying to set up TFS automated deploys, but I keep getting the following error message when running the deploy. It appears that I have a service called "Web Management Service", but the error message says that I need the "Web Deployment Agent Service". I tried installing Web Deploy 2.0, but the server said that I already had it installed. What can I do to fix this problem?
        Error Code: ERROR_DESTINATION_NOT_REACHABLE
        Could not connect to the destination computer ("myServer"). On the destination computer, make sure that Web Deploy is installed and that the required process ("Web Deployment Agent Service") is started.
    Update: It looks like the Web Deployment Agent is not installed by default. I had to re-install MSDeploy, select Change or Custom, and then add the Web Deploy Agent service. Now the deploy works correctly.
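    For reference, the same fix can be scripted; a sketch assuming the Web Deploy MSI is at hand (the MSI file name is a placeholder, and msdepsvc is the short name the "Web Deployment Agent Service" normally registers under):

        rem install or modify Web Deploy with every feature, including the remote agent service
        msiexec /i WebDeploy_amd64_en-US.msi ADDLOCAL=ALL /qn
        rem make sure the agent service is running
        net start msdepsvc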

    Read the article

  • IT lead does not have a backup, DR plan in writing

    - by Alex
    This is a general management question for the IT managers out there. We are a small firm with about 4 servers in our colo cabinet and no full-time IT manager, but we do have one person on a monthly contract, and I am having a terrible time getting him to share what these plans actually are. I am sure he HAS a plan (and it's probably in his head), but that does us no good if he gets hit by a bus. How would you handle this? He is a long-time friend, but I fear this is dangerous for us long term. I have confronted him on several occasions about this, and he tells me not to worry, he has got it covered. Thanks.

    Read the article

  • How to backup or export PowerStrip display profiles?

    - by Sk8erPeter
    I would like to save two of my saved PowerStrip display profiles. Earlier I set a 720x540 resolution and some other settings (frequency, etc.) for another display device usually used in extended mode, which is now NOT connected. But when I go to "Advanced timing options", I see some different settings. I thought I could copy the settings with the copy icon, but that way I would copy the wrong ones, not the predefined ones (with the 720x540 resolution). What is the best method to "export" these settings before formatting the hard drive?

    Read the article

  • How to set the VirtualDocumentRoot based on the files within

    - by Chuck Vose
    I'm trying to set up Apache to use the VirtualDocumentRoot directive, but my sites aren't all exactly the same. Most of the sites have a drupal folder which should be the document root, but there are a few really old Drupal sites, a few Rails sites, some Django sites, etc. that want the document root to be / or some other folder. Is there a way to set up VirtualDocumentRoot based on a conditional, or is there a way to use RewriteRule/RewriteCond to detect that / is the wrong folder when there is a drupal folder or a public folder? Here's what I have so far:
        <VirtualHost *:80>
          # Wildcard ServerAlias; this is the default vhost if no specific vhost matches first.
          ServerAlias *.unicorn.devserver.com
          # Automatic ServerName, based on the HTTP_HOST header.
          UseCanonicalName Off
          # Automatic DocumentRoot. This uses the 4th-level domain name as the document root;
          # for example, http://bar.foo.baz.com/ would respond with /Users/vosechu/Sites/bar/drupal.
          VirtualDocumentRoot /Users/vosechu/Sites/%-4/drupal
        </VirtualHost>
    Thanks in advance! -Chuck
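    One direction that may work, sketched and untested: drop VirtualDocumentRoot for this vhost and let mod_rewrite do the host-to-directory mapping instead, in the style of the mass-virtual-hosting rewrite examples, so a per-site condition can pick the subfolder. Folder names mirror the question; further RewriteCond/RewriteRule pairs can cover public/ and the other layouts:

        <VirtualHost *:80>
          ServerAlias *.unicorn.devserver.com
          UseCanonicalName Off
          RewriteEngine On
          # remember the first label of the host name (bar.unicorn.devserver.com -> "bar")
          RewriteCond %{HTTP_HOST} ^([^.]+)\.
          RewriteRule ^ - [E=SITE:%1]
          # sites that contain a drupal/ folder are served from it
          RewriteCond /Users/vosechu/Sites/%{ENV:SITE}/drupal -d
          RewriteRule ^/(.*)$ /Users/vosechu/Sites/%{ENV:SITE}/drupal/$1 [L]
          # everything else is served straight from the site folder
          RewriteRule ^/(.*)$ /Users/vosechu/Sites/%{ENV:SITE}/$1 [L]
        </VirtualHost>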

    Read the article

  • How to make a web icon open with a specific browser?

    - by David
    I have an icon on my desktop for a website called QUAKE LIVE, and I use Google Chrome as my default browser. The website isn't compatible with Google Chrome, but it is with Mozilla Firefox. Is there any way to edit the properties of the icon so that it opens with Firefox instead of Chrome?
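    If the desktop icon is a plain internet shortcut (.url), it will always open in the default browser. A common workaround is a regular shortcut whose target launches Firefox with the site's address; a sketch, with the usual 64-bit-Windows install path and the site URL assumed:

        "C:\Program Files (x86)\Mozilla Firefox\firefox.exe" "http://www.quakelive.com/"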

    Read the article

  • VirtualBox - differencing disk based on another differencing disk

    - by Klinki
    I'm trying to create a differencing image based on another differencing image in VirtualBox 4.2.18. The official documentation says it should be possible: http://www.virtualbox.org/manual/ch05.html#diffimages Basically, I want to achieve this drive hierarchy:
        + immutable image with Debian and all software installed
        +---- differencing image with specific configuration, autoreset=off, readonly
        +-------- differencing image with autoreset=on
        +---- another differencing image for a different virtual machine
        +-------- differencing image with autoreset=on
    I successfully created a differencing image based on a differencing image, but I'm not able to connect it to a virtual machine :( It always shows the error:
        Failed to open the hard disk .... cannot register hard disk ... because hard disk with UUID ... already exists
    (Screenshots of the Virtual Media Manager window and the error dialog omitted.) What is very strange is that the new differencing image (tempdrive.vdi) doesn't have an Actual Size of 0. I wasn't able to connect it, but it still has 36 KB of data on it... This is very similar to this older question: How to create a chained differencing disk of another differencing disk in Virtual Box? However, the suggested solution no longer works in VirtualBox 4.2.18, so I posted this as a new question.
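    The VBoxManage calls usually involved in untangling a stale "UUID ... already exists" registration are releasing the child image and attaching it to the VM again; a sketch with the file name from the question and placeholder VM/controller names (untested for chained differencing images, so keep a copy of the .vdi first):

        # release the stale registration of the child image
        VBoxManage closemedium disk tempdrive.vdi
        # attach it to the VM again on the desired port
        VBoxManage storageattach "MyVM" --storagectl "SATA Controller" --port 0 --device 0 --type hdd --medium tempdrive.vdi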

    Read the article

  • OS X 10.9 Time Machine backup to NAS

    - by user214577
    I recently upgraded from 10.6.8 to 10.9. On Snow Leopard I was able to make Time Machine backups over the network to my NAS; I think I had to tweak some settings, but I don't recall what I did. Now that I have upgraded to Mavericks, I cannot back up to my NAS using Time Machine. My question is: what do I have to do to allow Time Machine backups over the network in 10.9? I tried looking for solutions online but did not find anything relating to Mavericks.
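    The setting that usually has to be reapplied after an upgrade is the "unsupported network volumes" flag, and since 10.7 the destination can also be set from the command line with tmutil. A sketch with a placeholder share name and credentials (whether this is enough on 10.9 depends on the NAS's AFP support):

        # let Time Machine offer network shares it does not officially support
        defaults write com.apple.systempreferences TMShowUnsupportedNetworkVolumes 1
        # point Time Machine at the NAS share
        sudo tmutil setdestination "afp://user:password@nas.local/TimeMachine"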

    Read the article

  • DNS-Based Environment Determination

    - by zvolkov
    I found the following advice (quoted below). Where can I find more details on how exactly to implement this on Windows? Any guide or how-to, or maybe you can provide your invaluable suggestions? Specifically, how do I make it so that "all QA servers would first resolve entries in qa.example.com first and then if that lookup failed they would try example.com"? (I'm a dev, not a DNS specialist, but our IT support has refused to help with this.)
        Use DNS Based Environment Determination for your servers. Do this by initially splitting your top level domain into a number of sub domains depending on their function, and then creating DNS Service Names in each of the sub domains pointing to the relevant server for that service. Based on the list above we would then have:
            clientdb.prod.example.com for Production
            clientdb.perf.example.com for Performance Testing
            clientdb.qa.example.com for QA
            clientdb.dev.example.com for Development
        Servers then resolve entries in their relevant sub domain by function. That is, all QA servers would first resolve entries in qa.example.com first and then if that lookup failed they would try example.com. This allows you to have a single configuration entry for your client database hostname (clientdb) that would resolve correctly in all environments. This technique has the added advantage of still having global services defined in a common top level domain.
    This seems to be related to providing "split horizon" DNS service. Reading about that, I see that I will probably need a separate DNS server for each environment. Is this true, or does Windows support some form of "tagging" the records so that they are visible depending on the requestor's IP?
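    For the "try qa.example.com first, then example.com" part, nothing special is needed on the DNS server side: that behaviour comes from the DNS suffix search list on each client. A sketch for setting it on a QA server (the cmdlets exist on Windows 8/Server 2012 and later; on older systems the same list lives in the SearchList registry value under Tcpip\Parameters, or in Group Policy):

        # resolve bare names against qa.example.com first, then example.com
        Set-DnsClientGlobalSetting -SuffixSearchList @("qa.example.com", "example.com")
        # verify the current list
        Get-DnsClientGlobalSetting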

    Read the article

  • Why is FTP server slowing down the web server?

    - by user1448031
    I am running Apache and FileZilla FTP Server on Windows. I've noticed for the last few months that whenever I start up the FTP server, the websites start to run slowly. When I shut off the FTP server, the websites speed up again. Sometimes I need to restart Apache after shutting off the FTP server to speed the websites up again. I only start the FTP server when I need to allow remote file transfers; other than that, it's always off. I'm not sure what's happening or where to look.

    Read the article

  • Allow Single IP to access ASP.NET Web Service (ASMX) using Firewall

    - by Suresh Agrawal
    I have an ASP.NET web service (ASMX) in a separate project that is hosted on a Windows server which also runs other ASP.NET web applications. How can I restrict the web service so that it can only be accessed from a single IP address? I want my web service to be reachable only from the one IP address I configure. If a request comes from any other IP, it should never reach my web service and should be discarded by Windows Firewall itself. I know this is something to do with Windows Firewall; I did it for SQL Server previously, but I don't know how to configure it for a single ASP.NET web service project.
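    One thing to keep in mind: Windows Firewall filters by address and port, not by URL, so this only works cleanly if the web service's site listens on its own port (binding). With that assumption, a sketch for Server 2008 or later (the port and partner address are placeholders; inbound traffic that matches no allow rule is dropped by the default inbound block):

        rem allow only the partner's address to reach the port the ASMX site is bound to
        netsh advfirewall firewall add rule name="ASMX partner only" dir=in action=allow protocol=TCP localport=8443 remoteip=203.0.113.10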

    Read the article

  • Secretary cannot add appointments to boss's calendar after exchange restore from backup

    - by therulebookman
    The calendar is the boss's calendar on Exchange. I have set permissions for it through his Outlook to give the secretary and a few other people "Editor" access to his calendar. All the editors can view the calendar, but only he can add new appointments. Anyone else who tries to add an appointment gets "The item cannot be saved in this folder. The folder was deleted or moved or you do not have permission." The permissions are correct (Editor), and the item hasn't been deleted or moved; it's in his mailbox on Exchange. The message says something about the mailbox size, but he is well under the size limit anyway. He is using Outlook 2003, and I have tried accessing it from Outlook 2003 and 2007, but I don't think that is related. I tried clearing the forms cache and enabling disabled items: there were no disabled items, and clearing the cache didn't help. I also tried "Allow all forms", but this apparently doesn't apply in this scenario as we are not using any custom forms. Is there any way to delete just his calendar so that I can exmerge it back in (after exporting to PST, of course)? I really can't exmerge out his whole mailbox, delete it, and exmerge it back in, because he works all sorts of hours, but if this is the only way, then I'll have to do it. Is there any other possible solution?
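    Before resorting to deleting and re-importing the calendar, one low-risk thing to try for delegate save errors is regenerating the hidden free/busy data from the boss's profile; a hedged sketch (run on his machine after closing Outlook, using switches documented for Outlook 2003):

        rem clears and regenerates the free/busy information
        outlook.exe /cleanfreebusy
        rem optional: also rebuild reminders if they misbehave after the restore
        outlook.exe /cleanreminders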

    Read the article
