Search Results

Search found 9952 results on 399 pages for 'big al'.

Page 190/399

  • Why doesn't rsync use delta-transfer for local files?

    - by o_O Tync
    I have a big ISO image which is currently being downloaded by a torrent client with space-reservation turned on: that means the file size does not change, while some chunks in it (4 MiB each) constantly change as the download progresses. At 90% downloaded, I do an initial rsync to save time later:

        $ rsync -Ph DVD.iso /some/target/
        sending incremental file list
        DVD.iso
        2.60G 100% 40.23MB/s 0:01:01 (xfer#1, to-check=0/1)
        sent 2.60G bytes received 73 bytes 34.59M bytes/sec
        total size is 2.60G speedup is 1.00

    Then, when the file is fully downloaded, I rsync again:

        total size is 2.60G speedup is 1.00

    Speedup=1 says delta-transfer was not used, although 90% of the file has not changed. Why?!
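
    A minimal sketch of the likely explanation, assuming stock rsync: when both source and destination are local paths, rsync implies --whole-file and skips the delta-transfer algorithm entirely, on the assumption that plain disk copying is faster than computing rolling checksums. Passing --no-whole-file forces the delta algorithm back on:

        # Force delta-transfer even for a local copy; by default rsync
        # implies --whole-file when no remote host is involved.
        rsync --no-whole-file -Ph DVD.iso /some/target/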

    Read the article

  • Spam mail through SMTP and user spoofing

    - by Josten Moore
    I have noticed that it's possible to telnet into a mail server that I own and send spoofed messages to other clients. This only works for the domain that the mail server handles; I cannot do it for other domains. For example, let's say that I own example.com. If I telnet example.com 25, I can successfully send a message to another user without authentication:

        HELO local
        MAIL FROM: [email protected]
        RCPT TO: [email protected]
        DATA
        SUBJECT: Whatever
        this is spam Spam spam spam
        .

    I consider this a big problem; how do I secure this?
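
    A sketch of one common mitigation, assuming the MTA is Postfix (the post doesn't say which server is in use) with SASL authentication already configured; the map path is a placeholder. It rejects MAIL FROM addresses that don't match the authenticated login, so an unauthenticated session can no longer claim a local sender:

        # /etc/postfix/main.cf -- map file name is hypothetical
        smtpd_sender_login_maps = hash:/etc/postfix/sender_logins
        smtpd_sender_restrictions =
            reject_authenticated_sender_login_mismatch,
            reject_unauthenticated_sender_login_mismatch

    Publishing SPF/DKIM records also helps receiving servers reject such forgeries.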

    Read the article

  • Outlook is trying to retrieve data from Microsoft Exchange

    - by Adrian Baker
    Some of the users on my network keep getting a pop-up message saying "Outlook is trying to retrieve data from Microsoft Exchange". Microsoft Office 2003 is installed on these computers, and when this message appears it freezes each user's Outlook. I have read some articles saying it is a problem with the 2002 and 2003 versions of Outlook, so I upgraded some users to MS Office 2007, but I still have the same problem. This has become a big problem, as it happens often and users are getting annoyed. If anyone can give me some advice on how to solve this and what could be causing it, I would be most grateful.

    Read the article

  • Azure website restarts and picks up old DLL version

    - by vipul dumaniya
    One of my sites is hosted on Windows Azure. When the site is restarted from the Manage Windows Azure panel, it loads an old version of a DLL, and the site stays down until I force another restart by re-deploying global.asax or touching web.config. After that second restart the site works perfectly and picks up the latest DLL, so if the problem were in my code, it would not work then either; I don't think the issue is on the code side. The error looks like:

        Could not load type 'DSF.DATA.Repository.RecurringOrderLogResposity' from 'DSF.DATA Version 1.0.0'

    I deploy changed DLLs over FTP, and after a manual restart they take effect successfully. I have already uploaded the latest DLL, but whenever the site is restarted from the Azure panel the error comes back, and the site is down until I restart it again by re-deploying global.asax. Please help; I am in big trouble, as this is a live site with a lot of traffic. Thanks, Vipul

    Read the article

  • How do I put back different SCSI hard drives into their original RAID arrays across different servers?

    - by Edgar
    I potentially have a big mess on my hands: I received today a box with several hard drives that used to be connected to different servers, each of them using an unknown (at least as of right now) RAID configuration. Regretfully, they are not marked, and I'm not sure how to go about putting them back into their original servers. I don't have much more information: I don't know what type of array was used in each instance, and I don't have any specifics about the RAID controller originally used in each of the servers (the servers themselves are at a remote location with no easy access). Is there a way to sort through this mess? What would be the consequences of using trial and error? This might be a very basic question, but I don't have much experience dealing with RAID arrays.
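
    A hedged aside: if any of the drives came from Linux software RAID (hardware controllers store metadata in proprietary formats that the vendor's own tools read best), the md superblock on each member records the array UUID, member role, and home host, which is enough to regroup drives by array without trial and error:

        # Print the RAID superblock of one member partition:
        mdadm --examine /dev/sdb1
        # Or scan all devices and group members by array UUID:
        mdadm --examine --scan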

    Read the article

  • Can irssi ignore the 24h DSL reconnect?

    - by mcnesium
    A couple of weeks ago I had to switch my ISP from cable to DSL, and now I have this ridiculous disconnect/reconnect cycle every 24h. Getting a new IP address every day is no big deal, with one exception: since I host my irssi screen on a machine inside the LAN, every reconnect clutters my history with a topic announcement, the list of users in each channel, the channel creation date, and so on. That's about 10 lines of redundant content every day. It is especially annoying in channels with very little traffic, because you can hardly see the actual content among the every-day junk. So I was wondering if I can tell irssi to silently ignore the reconnection details, so that the only meta-content in each channel goes back to "Day changed to ...", like back in the days of cable internet.
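
    A partial workaround, sketched with irssi's standard ignore levels (the channel name is a placeholder): this suppresses join/part/quit and topic noise in the given channels, though the names list printed on rejoin may still need theme-level tweaking:

        /ignore -channels #lowtraffic * JOINS PARTS QUITS TOPICS
        /save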

    Read the article

  • High Lock Wait ratio in MySQL

    - by FunkyChicken
    On my site I log every pageview (date, IP, referrer, page, etc.) in a simple MySQL table. This table gets very few selects (3 per minute) but a lot of inserts (about 100 per second). Today I changed this table from an InnoDB table to a MEMORY table; this made sense to me to prevent unnecessary hard disk IO. I also prune this table once per minute, to make sure it never gets too big. Performance-wise, things are running fine, but while running tuning-primer I noticed that my current lock wait ratio is quite high: Current Lock Wait ratio = 1 : 561. My question: should I worry about this lock wait ratio? And is there something I can change in my my.cnf so that the ratio isn't so high?
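
    One hedged observation: the MEMORY engine only supports table-level locking, so 100 inserts per second plus the once-a-minute pruning DELETE all queue on a single lock, whereas InnoDB locks individual rows. The server's own counters show whether waits are really piling up:

        # Compare Table_locks_waited with Table_locks_immediate; a rising
        # "waited" count confirms contention on table-level locks.
        mysql -e "SHOW GLOBAL STATUS LIKE 'Table_locks%';"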

    Read the article

  • How to correctly install Tomcat 6 on CentOS 5.5

    - by user582862
    Hi guys, I am a bit confused about how to install Tomcat 6 on CentOS 5.5 Final. This is what I am trying to do:

        # cd /etc/yum.repos.d/
        # wget http://jpackage.org/jpackage50.repo
        # yum install tomcat6 tomcat6-webapps tomcat6-admin-webapps

    But when I run the wget command, this is what I get:

        Resolving www.jpackage.org... failed: Temporary failure in name resolution.
        wget: unable to resolve host address `www.jpackage.org'

    Could anyone kindly show me the right way, please? Really in trouble at the moment with this. Thanks in advance.
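
    The error is DNS resolution failing before any repository is involved, so a first hedged check is whether the machine can resolve names at all:

        # Does any lookup succeed? If not, check the nameserver entries:
        ping -c1 jpackage.org
        cat /etc/resolv.conf    # should list at least one reachable nameserver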

    Read the article

  • How to make a local Apache server public/visible?

    - by George
    Hello. I am running an Apache2 server on Fedora 13. I'd like to make it publicly accessible (visible). For example, I'd like that when somebody types http://my.ip.number/ they would see what I have in my document root folder, just for a presentation of course work at university. Permissions are set to 755, the user owning the document root is apache, and SELinux is temporarily disabled. But port 80 is closed. I tried to open it by adding an entry to iptables and restarting them; no change. I guess I am missing something big here. Help would be greatly appreciated. Note: I have a static (public, real) IP address.
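
    A hedged guess at the usual trap on Fedora: the default INPUT chain ends in a catch-all REJECT, so a rule appended with -A lands after it and never matches. Inserting the rule at the top instead usually does it:

        # Insert (not append) an ACCEPT for port 80, then persist it:
        iptables -I INPUT -p tcp --dport 80 -j ACCEPT
        service iptables save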

    Read the article

  • monitoring TCP/IP performance on Solaris

    - by Andy Faibishenko
    I am trying to tune a high message traffic system running on Solaris. The architecture is a large number (600) of clients which connect via TCP to a big Solaris server and then send/receive relatively small messages (.5 to 1K payload) at high rates. The goal is to minimize the latency of each message processed. I suspect that the TCP stack of the server is getting overwhelmed by all the traffic. What are some commands/metrics that I can use to confirm this, and in case this is true, what is the best way to alleviate this bottleneck?
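
    A starting point, hedged to stock Solaris tooling: the kernel's per-protocol counters reveal whether connections are being dropped at the listen queue or segments retransmitted, both classic signs of an overwhelmed TCP stack:

        # Dump TCP MIB counters; tcpListenDrop/tcpListenDropQ0 indicate a
        # full accept backlog, tcpRetransSegs indicates loss or pressure.
        netstat -s -P tcp | egrep 'ListenDrop|RetransSegs'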

    Read the article

  • In Outlook 2010, can you add "Categories" to the "New Email" Ribbon?

    - by Jeff
    I couldn't figure out how to do this in Outlook 2007, and I was hoping I could do it in Outlook 2010. I want the ability to quickly apply a category when composing a new email (typically a "Waiting For..." category) for things that need a response. It is possible to apply a category by clicking the "Options" ribbon, then the little arrow under the More Options section, but why can't I get the nice big "Categories" drop-down that's available in the "Tags" section of the main Outlook window? There are about a kabillion commands in the "Customize Ribbon" dialog box for the New Mail window, but I couldn't find anything about Categories. Should I just give up?

    Read the article

  • Upload large database SQL file

    - by Devy
    I have a database of more than 20 GB on my hard disk. What is the best way to upload it with the least possible load (and cost) on the server? I'm on Windows 7, and I have FTP and SSH access to the server. I avoid using FTP because my connection cuts off a lot; I can't imagine re-uploading the whole file after failing at 99%. I found some tools that split the large .sql file into small .sql files, but they didn't mention how to gather those files back into one. Another way is to archive the big .sql file to .rar with the -v option, upload the parts through FTP, then unpack them. But unpacking will also cost, right? I know it will cost in any case, but any best practice will be strongly appreciated.
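
    A resumable approach, sketched with standard tools over the existing SSH access (assuming rsync is available on the Windows side, e.g. via Cygwin; paths are placeholders): compress first, since SQL dumps shrink a lot, then let rsync resume the transfer after each disconnect instead of restarting:

        # Compress, then upload with resume support over SSH:
        gzip -c dump.sql > dump.sql.gz
        rsync --partial --progress -e ssh dump.sql.gz user@server:/tmp/
        # If you do split the file, reassembly is just split/cat:
        #   split -b 500M dump.sql part. ; cat part.* > dump.sql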

    Read the article

  • MariaDB, Galera, xtrabackup - do I need the binary log?

    - by bernhardrusch
    We are using a MariaDB Galera cluster with 3 nodes, with xtrabackup for state transfer. We have a problem with the binary logs: they got too big and crashed the server. We can remove them manually with PURGE BINARY LOGS; another way would be to set expire_logs_days so they expire on their own. I know that we could use xtrabackup to back up the DB and use the binlog for point-in-time recovery. But do we really need the binary log for Galera itself to work?
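
    For what it's worth, a hedged sketch of the usual configuration: Galera replicates through its own writesets and only requires row-based binlog format; keeping binlogs on disk is optional unless you want point-in-time recovery or an async slave, so capping retention is safe (values below are illustrative):

        # my.cnf sketch
        [mysqld]
        binlog_format = ROW      # required by Galera regardless
        expire_logs_days = 3     # auto-purge on-disk binlogs after 3 days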

    Read the article

  • .htaccess ignored, SPECIFIC to EC2 - not the usual suspects

    - by tedneigerux
    I run 8-10 EC2-based web servers, so my experience is many hours, but limited to CentOS, specifically Amazon's distribution. I'm installing Apache using yum, therefore getting Amazon's default build of Apache. I want to implement canonical redirects from the non-www (bare/root) domain to www.domain.com for SEO using mod_rewrite, BUT MY .htaccess FILE IS CONSISTENTLY IGNORED. My troubleshooting steps (outlined below) lead me to believe it's something specific to Amazon's build of Apache.

    TEST CASE: Launch an EC2 instance, e.g. Amazon Linux AMI 2013.03.1, SSH to the server, and run:

        $ sudo yum install httpd
        $ sudo apachectl start
        $ sudo vi /etc/httpd/conf/httpd.conf
        $ sudo apachectl restart
        $ sudo vi /var/www/html/.htaccess

    In httpd.conf I changed the following, in the DOCROOT section/scope: AllowOverride All. In .htaccess I added (EDIT: I added RewriteEngine On later):

        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^/(.*) http://www.domain.com/$1 [R=301,L]

    Permissions on .htaccess are correct, as far as I can tell:

        $ ls -al /var/www/html/.htaccess
        -rwxrwxr-x 1 git apache 142 Jun 18 22:58 /var/www/html/.htaccess

    Other info:

        $ httpd -v
        Server version: Apache/2.2.24 (Unix)
        Server built: May 20 2013 21:12:45
        $ httpd -M
        Loaded Modules: core_module (static) ... rewrite_module (shared) ... version_module (shared)
        Syntax OK

    EXPECTED BEHAVIOR:

        $ curl -I domain.com
        HTTP/1.1 301 Moved Permanently
        Date: Wed, 19 Jun 2013 12:36:22 GMT
        Server: Apache/2.2.24 (Amazon)
        Location: http://www.domain.com/
        Connection: close
        Content-Type: text/html; charset=UTF-8

    ACTUAL BEHAVIOR:

        $ curl -I domain.com
        HTTP/1.1 200 OK
        Date: Wed, 19 Jun 2013 12:34:10 GMT
        Server: Apache/2.2.24 (Amazon)
        Connection: close
        Content-Type: text/html; charset=UTF-8

    TROUBLESHOOTING STEPS: In .htaccess, I added:

        BLAH BLAH BLAH ERROR
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^/(.*) http://www.domain.com/$1 [R=301,L]

    My server threw an error 500, so I knew the .htaccess file was processed. As expected, it created an error log entry:

        [Wed Jun 19 02:24:19 2013] [alert] [client XXX.XXX.XXX.XXX] /var/www/html/.htaccess: Invalid command 'BLAH BLAH BLAH ERROR', perhaps misspelled or defined by a module not included in the server configuration

    Since I have root access on the server, I then tried moving my rewrite rule directly into the httpd.conf file. THIS WORKED, which tells us several important things are working:

        $ curl -I domain.com
        HTTP/1.1 301 Moved Permanently
        Date: Wed, 19 Jun 2013 12:36:22 GMT
        Server: Apache/2.2.24 (Amazon)
        Location: http://www.domain.com/
        Connection: close
        Content-Type: text/html; charset=UTF-8

    HOWEVER, it is bothering me that it didn't work in the .htaccess file, and I have other use cases where I need it to work in .htaccess (e.g. an EC2 instance with named virtual hosts). Thank you in advance for your help.
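
    One likely culprit, hedged but consistent with the evidence above: in per-directory (.htaccess) context, mod_rewrite strips the directory prefix before matching, so the rule sees "index.html" rather than "/index.html". A pattern anchored as ^/(.*) therefore matches in httpd.conf but never in a .htaccess. Dropping the leading slash, with RewriteEngine On in the file itself, is the usual fix:

        # /var/www/html/.htaccess -- per-directory context, no leading slash
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]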

    Read the article

  • Sharing files between multiple sites using only desktop software

    - by perlyking
    Our organisation has three sites: a head office, where the master copies of company files are stored, plus two branch offices using only workstations and a NAS or two. Currently we're talking about <10GB. At the main office we have no admin access to the file server, as it is entirely controlled by the larger institution where we are located; for the same reason, we have no VPN remote access to this network. Instead, we simply have access to a network share over a Novell LAN. Question: how can we share files between offices in a way that minimises latency, i.e. that gives us a mirror of the main network share at each site? (There is little likelihood of concurrent editing, and we can live with the odd file conflict now and again.) Up to now, branch office staff have had to use GotoMyPC-type solutions to remotely access files held at the main office. Or email. I was hoping to use Google Drive on a dedicated workstation at each office to sync the contents of the network share (head office) or NAS (branch offices) via the cloud, but at my last attempt (29 Jun '12) the Google Drive installer would not allow me to designate the remote network share as the "target" folder. (I chose Google Drive over Dropbox et al. as we already use GMail for corporate mail.) The next idea was to use a designated workstation at head office to mirror the network share to a local drive, then use Google Drive to push that to the cloud. This seems a step too far, and I don't have any good ideas about how to achieve the network/local mirroring, as we can't, for example, install the rsync daemon on the server. I do not want to use Google Drive locally on each workstation, as this would inconvenience users and, more importantly, move files off the backed-up, well-maintained (UPS, RAID etc.) network share at head office. Our budget is only in the £100's. Should we perhaps just ditch the head office server and use something like JungleDisk? At least that presents the user with what appears to be a mapped drive.
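
    For the network-to-local mirroring step, a hedged sketch using only desktop software on the dedicated Windows workstation (no daemon on the server needed, since robocopy just reads the share like any client; share name and paths are placeholders):

        rem Mirror the Novell share into a local folder Google Drive can
        rem watch; /MIR mirrors deletions too, so point it at a dedicated
        rem folder only.
        robocopy \\headoffice\share C:\DriveSync /MIR /R:2 /W:5 /LOG:C:\sync.log

    Run it from Task Scheduler every few minutes to keep the mirror fresh.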

    Read the article

  • Smart backup software

    - by gisek
    I use a laptop on a daily basis. As I have a lot of important data on it, it would be nice to back up some directories every day. Can you recommend a specific application that would take care of this? Maybe there is an app that would instantly commit changes I make in a directory on my laptop to the backup folder? The important thing is that I have some big files (a few GB each) that receive minor changes very often; I'm talking about VirtualBox disk images. It would be nice if the software could handle those smartly. Also note that I'd like to store the backup on an external USB HDD, which sometimes isn't plugged in.
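
    Not an app recommendation, but a hedged sketch of why delta-capable tools matter here: rsync can update only the changed blocks of a large VM image in place, instead of rewriting a multi-GB file on every run, and a one-line guard skips the run when the disk isn't mounted (paths are placeholders):

        # Update only changed blocks of big files; bail out politely
        # if the USB disk isn't plugged in.
        [ -d /media/usbdisk/backup ] || exit 0
        rsync -a --partial --inplace "$HOME/VirtualBox VMs/" /media/usbdisk/backup/vms/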

    Read the article

  • yum update with shared cache

    - by Sammitch
    We've got a big batch of RHEL6 machines that are due for patching, and for some reason the process here does not involve a local repo. I'm new here, I've asked why, ["it just didn't work"] and I don't have enough time to make it work before the window that's already scheduled. So the usual method is to install yum-downloadonly and run yum update --downloadonly --downloaddir=/mnt/cifs_share and then yum update /mnt/cifs_share/*.rpm which just does not look right to me since not all of these machines have the same set of installed packages. The method I tried today was mounting the share to /var/cache/yum/x86_64/6Server/rhel-x86_64-server-6/packages/ which worked, but then yum automatically deleted everything once it finished. I've looked over the yum man page, but I don't see any flag I can feed it to stop it from deleting everything, nor a flag like up2date's --tmpdir=/mnt/cifs_share. Can anyone out there help me kludge this together until I can get a local repository working?
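
    The deletion behaviour is governed by yum's keepcache option (documented in the yum.conf man page) rather than a command-line flag; a minimal sketch of the relevant lines:

        # /etc/yum.conf -- keep downloaded rpms in the cache after install
        [main]
        keepcache=1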

    Read the article

  • After making a boot disk using Rufus in a USB 3.0 port, it doesn't work in other ports

    - by sin
    I have a big problem! After making a boot disk using Rufus in a USB 3.0 port, it doesn't work in other ports. I have to install Windows on another PC which has USB 2.0 ports only, so I made the USB boot disk using a USB 3.0 stick in a 3.0 port. But since then it has never worked anywhere except in 3.0 ports, and I couldn't restore my USB stick: I already formatted it in cmd and with other tools while in the USB 3.0 port, but there is no change, and the stick isn't even detected in a USB 2.0 port. Please help me.
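
    A hedged recovery attempt with Windows' own diskpart, which rebuilds the partition table from scratch rather than just reformatting. Replace "2" with the number your stick shows under list disk, and be very sure of it: clean wipes the selected disk irreversibly.

        diskpart
        DISKPART> list disk
        DISKPART> select disk 2
        DISKPART> clean
        DISKPART> create partition primary
        DISKPART> format fs=fat32 quick
        DISKPART> active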

    Read the article

  • New Windows 7 Install Crashing

    - by bobber205
    One big reboot crash and one smaller crash already, 15 minutes in. I did a basic install of Windows 7, then installed Chrome and Firefox. I had just finished loading up my Gmail account in Chrome/Firefox to show the speed difference, and we thought it would be hilarious to see how slow IE8 was. :P Just as IE8 finished opening, the computer's screen went black. After a restart and a couple of minutes, Explorer crashed as well. What is going on? This install is only 15-20 minutes old. :P

    Read the article

  • Drupal + GMap macro, map markers not displayed

    - by mingos
    Hi. I've run into a strange problem with the GMap module for Drupal. When I display a map inside a node using a GMap macro, everything is displayed correctly (according to what I specify in the macro or leave at default) except the map markers. I'm trying to specify a map marker, but it refuses to be displayed. My macro is the following:

        [gmap zoom=17 |center=53.77420697757659,20.474138259887695 |markers=big blue::53.77420697757659,20.474138259887695]

    I was unable to find any help on the Drupal forums, both the official one and one local to my country. For completeness' sake: I do not wish to use a GMap view, just add a macro to a regular node. Hope you can help me find a solution. Thanks in advance for your replies.

    Read the article

  • Web log file analyzer

    - by Peter Štibraný
    I already use Google Analytics on my page, but I'd like to get additional info from log files. I've looked at various packages over the last few days, but nothing has impressed me so far. Some requirements:

        - must work at the log file level (I use Apache combined logs, but can configure Apache to produce other types of logs)
        - can generate static reports (Windows/Linux) or use a GUI (Windows only)
        - should make it easy to add custom user agents and rerun the analysis
        - if it can recognize installation of Eclipse plugins from the log, that would be a big plus
        - understands Google SERP position from the referrer
        - should not require two days to set up (awstats, I am looking at you)
        - should still be under active development (i.e. analog isn't a good answer)
        - preferably free, or at least not very expensive :-)

    Any good analyzer programs out there?

    Read the article

  • video card for only watching videos

    - by Nothing 2 Lose
    I recently quit gaming, so I took out my video card (GeForce GTX 550 Ti) and switched back to the (cheap) integrated graphics, because I don't want the dedicated card using up power and making unnecessary heat and noise. But now I get lag when I watch high-res movies or when I open several videos at the same time, and I don't want to go back to my (big/expensive) video card because that seems like overkill. What is the smallest/cheapest card that will be good enough just for watching videos without lag (but not for gaming)? PS: My CPU is an AMD Phenom 9850 Quad-Core Processor.

    Read the article

  • Convert old videos to have smaller sizes

    - by Tim
    I have some videos from a few years ago, in various formats such as avi, mpg, wmv, rm, rmvb, and so on. Their sizes are huge (more than 500 MB, sometimes 1 GB). Given that video compression has likely advanced since then, I would like to know which file formats and compression methods are recommended these days for achieving a big size reduction without obvious quality loss. How can I perform the format conversion and compression in Ubuntu 12.04? Command-line and batch ways would be the most convenient, although GUI ways are also appreciated. Thanks and regards!
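
    A batch sketch using ffmpeg with H.264, the usual space-saving choice; the crf value trades size against quality (lower means better quality), and older Ubuntu 12.04 builds may need avconv or -strict experimental for the AAC encoder:

        # Re-encode every matching video to H.264/AAC in an MP4 container.
        for f in *.avi *.wmv *.mpg; do
            [ -e "$f" ] || continue   # skip patterns that match nothing
            ffmpeg -i "$f" -c:v libx264 -crf 23 -preset slow \
                   -c:a aac -b:a 128k "${f%.*}.mp4"
        done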

    Read the article

  • Image archiving in network folders

    - by Steve
    Our company uses Symantec Enterprise Vault to archive network folders and files, presumably to save on disk space; I can't see any other benefit. We are an architectural firm, and the problem our users face is locating particular images in network folders. Because all the files are archived, Windows Explorer is unable to generate image thumbnails: each image needs to be individually restored from the archive by double-clicking before it can be viewed. This is a big time-waster for our architects. Symantec say there is no workaround. Does anyone know of an alternative we could use for archiving images? Alternatively, some batch utility to create and maintain a thumbs.db file in each folder? Thanks.
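
    Not a thumbs.db generator, but a hedged alternative with a similar effect: pre-render small previews into a sibling folder before files get archived, so browsing never touches the vault. ImageMagick's batch tool can do this in one pass:

        # Write 256px JPEG previews of every image into ./thumbs;
        # the archived originals are never restored or modified.
        mkdir -p thumbs
        mogrify -path ./thumbs -thumbnail 256x256 -format jpg *.jpg *.png *.tif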

    Read the article

  • Cross-platform "jail" for an application

    - by Alexander
    We currently have a variety of systems (Linux, Solaris, *BSD, HP-UX, ...) on which we are not allowed to install anything into / (but I have root access; that's strange, I know). We'd like to run Puppet on all of them, so the obvious idea is to install Puppet with all prebuilt dependencies into some isolated tree, something like a "jail", which lets it use dependencies from a prefix while still having access to the host system. The big advantages would be uniform deployment and updates. One solution that came to my mind is to deploy Gentoo Prefix and install Puppet there with the package manager. However, this requires a lot of extra space and some manual patching for each system. Maybe there are more elegant and simple solutions?
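
    One hedged alternative, assuming a usable Ruby already exists on each host: Puppet ships as a gem, and GEM_HOME confines the install and its gem dependencies to an arbitrary prefix without touching / (the prefix path below is a placeholder):

        # Install Puppet and its gem dependencies under /opt/puppet-jail;
        # nothing outside the prefix is modified.
        export GEM_HOME=/opt/puppet-jail
        gem install puppet
        GEM_HOME=/opt/puppet-jail /opt/puppet-jail/bin/puppet --version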

    Read the article
