Search Results

Search found 9199 results on 368 pages for 'topological sort'.

Page 268 of 368

  • During Vista Repair - No operating system is listed.

    - by Jack Marchetti
    After a Windows update, my brother's Gateway computer loads to "Step 3 of 3: 0%" and reboots. Safe Mode does not work. I placed a Vista DVD in the drive and rebooted. (Note: this is my Vista DVD, not the Recovery/System disc that would come with the computer. Gateway does not give you CDs anymore; I believe they store recovery on a partition, but that partition has been wiped out.)

    I chose "Repair Your Computer" and I get a dialog box, but no operating system is listed. I'm then prompted to "Load Drivers". What drivers am I supposed to be loading here, and from where? I placed a CD in the drive to "load drivers" but I don't see my DVD drive listed. All I saw was X:\Sources along with several Removable Media slots that were empty.

    On another screen I tried Startup Repair, which didn't do anything. I attempted to use System Restore, but it doesn't detect the hard drive. I'm guessing that I'm missing some sort of SATA driver and that is why the hard disk is not being found. Any ideas on this?
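    As I understand it, the "Load Drivers" prompt is asking for the storage controller (SATA/AHCI/RAID) driver, which would normally come from Gateway's support site on a USB stick. Before hunting for one, a quick check from the repair environment's Command Prompt (a sketch of an interactive diskpart session) would tell me whether the disk is visible to Windows at all:

        rem from System Recovery Options -> Command Prompt
        diskpart
        rem at the DISKPART> prompt:
        list disk
        list volume
        exit

    If "list disk" shows nothing at all, the storage driver really is missing; if the disk and volumes are visible but no OS is offered for repair, the problem is somewhere else.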

    Read the article

  • emails not sending from CentOS 5.6 VM on Win7 via PHP code

    - by crmpicco
    I am experiencing an issue where my CentOS 5.6 (Final) VM running on Windows 7 has stopped sending emails from my PHP code. I'm confident this isn't a coding issue, as I have the exact same code running in my office and emails send correctly from there, hence why I believe this to be a networking/configuration issue.

    In the /etc/hosts file on my VM I have the following:

        127.0.0.1    localhost.localdomain localhost
        192.168.0.9  crmpicco.co.uk m.crmpicco.co.uk dev53.localdomain

    When I run setup on my VM the DNS configuration is set to dev53.localdomain and my primary DNS is 192.168.0.1. In my /var/log/maillog file I see a lot of this sort of thing:

        Nov 19 14:36:58 dev53 sendmail[21696]: qAJEawI7021696: from=<[email protected]>, size=12858, class=0, nrcpts=1, msgid=<1353335817.9103820024efb30b451d006dc4ab3370@PHPMAILSERVER>, proto=ESMTP, daemon=MTA, relay=localhost.localdomain [127.0.0.1]
        Nov 19 14:36:58 dev53 sendmail[21693]: qAJEawvd021693: [email protected], [email protected] (48/48), delay=00:00:00, xdelay=00:00:00, mailer=relay, pri=42681, relay=[127.0.0.1] [127.0.0.1], dsn=2.0.0, stat=Sent (qAJEawI7021696 Message accepted for delivery)
        Nov 19 14:36:59 dev53 sendmail[21698]: qAJEawI7021696: to=<[email protected]>, delay=00:00:01, xdelay=00:00:01, mailer=esmtp, pri=132858, relay=mailserver.fletcher.co.uk. [213.171.216.114], dsn=5.0.0, stat=Service unavailable

    Is this likely to be a configuration issue?
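    Since the local sendmail accepts the message (stat=Sent to the local relay) and the failure (stat=Service unavailable) comes back from the remote relay, a first step could be a basic reachability check from the VM itself. A minimal sketch, reusing the relay hostname from the log above:

        # does the VM resolve and reach the remote relay on port 25?
        host mailserver.fletcher.co.uk
        telnet mailserver.fletcher.co.uk 25

    If this times out from home but works from the office network, that points at the home ISP blocking outbound port 25 (or the relay rejecting the home IP) rather than anything in the PHP code.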

    Read the article

  • Can Safari 5.1 for Mac OS display favicons for bookmarks in the Bookmarks Bar?

    - by Greg R.
    When bookmarking a web site, most contemporary browsers will display the site's favicon next to the bookmark, both in the bookmark view and in the bookmark toolbar. This is a useful feature. In the bookmark toolbar you can edit the name of the bookmark to be blank, effectively leaving the favicon there as an easily identifiable "button" from which to launch the bookmark. This lets you make more effective use of the space in the bookmark toolbar, and I use this approach in Firefox, Chrome, and IE, where my Bookmarks Toolbar is simply a row of favicon-only buttons.

    However, in Safari no favicon is ever displayed for bookmarks. In the full bookmark view only a generic globe icon is displayed, and in the Bookmarks Bar no icon at all is displayed, which means the habit of removing the bookmark name and leaving the favicon is useless. With the same configuration (synced between browsers via Xmarks), Safari just shows blank space where the favicons should be. The bookmark is there -- if you hover over it, the blank space changes color to indicate the presence of a bookmark and a tooltip with the URL pops up after about two seconds -- but it's really quite unusable.

    So, the question: is there an extension, plug-in, or modification of some sort that will enable the display of favicons for bookmarks in Safari (OS X Lion 10.7.3, Safari 5.1.3)?

    Read the article

  • Fully FOSS EMail solution

    - by Ravi
    I am looking at various FOSS options to build a robust email solution for a government-funded university. Commercial options are to be chosen only in the worst-case scenario. Here are the requirements:

    - Approx 1000-1500 users - Postfix or Exim? (Sendmail is out ;-))
    - Mailing lists for different groups / need a web-based archive - Mailman? Sympa?
    - Centralised identity store - OpenLDAP? Fedora 389 DS?
    - Secure IMAP only, no POP3 required - Courier? Dovecot? Cyrus?
    - Anti-spam - SpamAssassin? What else?
    - Calendaring - ??
    - Webmail - good to have, not mandatory - needs to be very secure... so SquirrelMail is out ;-)?

    Other questions: What mailbox storage format should I use, and where should it be stored - database or file system? What are simple and effective HA options? Is there a web-proxy equivalent to Squid in the mail server world? Software load balancers? CARP? Monitoring and alerting? Backup?

    The government wants to stimulate the local economy by buying hardware locally from whitebox vendors, and local consultants and university students will do the integration. We looked at out-of-the-box integrated solutions like Axigen, Zimbra and GMail, but each was ruled out in favour of a DIY approach in the hope of full control over the data and avoiding vendor lock-in - which I thought was a smart thing to do. I wish more provincial governments in the developing world would think of this sort of initiative.

    As for the OS: Debian or FreeBSD would be first preference. Commercial OSes need not apply. CentOS is a second-tier option...
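    Just to make the moving parts concrete, one commonly paired stack (Postfix + Dovecot + OpenLDAP + SpamAssassin + Mailman) could be stood up on Debian from stock packages. This is only a sketch of a starting point, not a recommendation, and calendaring/webmail (e.g. SOGo, DAViCal, Roundcube) would still be separate decisions:

        # Debian starting point for one candidate stack (package names as of Debian 6/7)
        apt-get install postfix postfix-ldap dovecot-imapd spamassassin mailman slapd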

    Read the article

  • Typical outbound port list for guest access?

    - by Steve
    I manage a weekly rental house that includes wireless Internet access. I've allowed all outbound ports on my router, but my ISP has disabled my Internet access twice now because guests have downloaded (or served up) copyrighted content. So I'd like to institute some port filtering to discourage P2P sharing (see disclaimer below), but I don't want to inconvenience the 99.9% of folks who keep things above-board.

    My question is: what outbound ports are typically open for rental/hotel wireless Internet access, or where can I find such a list? TCP 80, 443, 25, 110 at a minimum. My own email service uses 995 and 465 for SSL, some guests may use IMAP, and I personally use SSH and FTP, so I'll open those. Roughly, I figure I need to open access to the privileged ports and close 1024 and above. Is there a whitelist I should institute for commonly used high ports? And does it make sense to block UDP above 1024?

    Disclaimer: I realize anyone replying to this message could circumvent the port filtering and share content to their heart's content. I do not need comprehensive P2P blocking, which requires more than a port whitelist. Anyone staying at the house shoulders the responsibility for their Internet use, per the rental contract. Also, anyone savvy enough to circumvent the port filters would hopefully be savvy enough to use some sort of peer blocking, thereby preventing the ISP from taking down the service.
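    If the router (or a Linux box sitting in front of it, e.g. something running dd-wrt) can take iptables rules, the whitelist could be expressed roughly like this - a sketch only, with a port list that would need tuning for real guests:

        # let replies to established connections through
        iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
        # whitelist common outbound TCP services (ftp, ssh, mail, web, imap/pop over SSL)
        iptables -A FORWARD -p tcp -m multiport --dports 21,22,25,80,110,143,443,465,587,993,995 -j ACCEPT
        # DNS
        iptables -A FORWARD -p udp --dport 53 -j ACCEPT
        # drop everything else leaving the LAN
        iptables -A FORWARD -j DROP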

    Read the article

  • HAproxy with MySQL Master-Master Replication incredibly slow

    - by Yayap
    I have two MySQL servers in multi-master mode, with an HAProxy machine for simple load balancing/redundancy. When I am connected to one of the servers directly and try to update about 100,000 entries, it completes, including replication, in about half a minute. When connecting through the proxy it usually takes over three whole minutes. Is it normal to have that kind of latency? Is something amiss with my proxy configuration (included below)? This is getting really frustrating, as I assumed the proxy would do some sort of load balancing, or at least have little to no overhead.

        #---------------------------------------------------------------------
        # Example configuration for a possible web application.  See the
        # full configuration options online.
        #
        #   http://haproxy.1wt.eu/download/1.4/doc/configuration.txt
        #
        #---------------------------------------------------------------------

        #---------------------------------------------------------------------
        # Global settings
        #---------------------------------------------------------------------
        global
            # to have these messages end up in /var/log/haproxy.log you will
            # need to:
            #
            # 1) configure syslog to accept network log events.  This is done
            #    by adding the '-r' option to the SYSLOGD_OPTIONS in
            #    /etc/sysconfig/syslog
            #
            # 2) configure local2 events to go to the /var/log/haproxy.log
            #    file. A line like the following can be added to
            #    /etc/sysconfig/syslog
            #
            #    local2.*    /var/log/haproxy.log
            #
            log         127.0.0.1 local2
            # chroot    /var/lib/haproxy
            # pidfile   /var/run/haproxy.pid
            maxconn     4096
            user        haproxy
            group       haproxy
            daemon
            #debug
            #quiet

            # turn on stats unix socket
            stats socket /var/lib/haproxy/stats

        #---------------------------------------------------------------------
        # common defaults that all the 'listen' and 'backend' sections will
        # use if not designated in their block
        #---------------------------------------------------------------------
        defaults
            mode                     tcp
            log                      global
            #option                  tcplog
            option                   dontlognull
            option                   tcp-smart-accept
            option                   tcp-smart-connect
            #option                  http-server-close
            #option                  forwardfor except 127.0.0.0/8
            #option                  redispatch
            retries                  3
            #timeout http-request    10s
            #timeout queue           1m
            timeout connect          400
            timeout client           500
            timeout server           300
            #timeout http-keep-alive 10s
            #timeout check           10s
            maxconn                  2000

        listen mysql-cluster 0.0.0.0:3306
            mode tcp
            balance roundrobin
            option tcpka
            option httpchk
            server db01 192.168.15.118:3306 weight 1 inter 1s rise 1 fall 1
            server db02 192.168.15.119:3306 weight 1 inter 1s rise 1 fall 1
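    Two things in that config seem worth double-checking, since HAProxy treats unitless timeouts as milliseconds: "timeout client 500" and "timeout server 300" mean half a second and 0.3 seconds respectively, and "option httpchk" sends an HTTP health check to a MySQL port, which combined with "rise 1 fall 1" can make the servers flap. A crude way to confirm where the time goes is to run the same batch directly and through the proxy (a sketch; the credentials, database and bulk_update.sql file are placeholders):

        # direct to one master
        time mysql -h 192.168.15.118 -P 3306 -u someuser -p somedb < bulk_update.sql
        # through the proxy
        time mysql -h proxy.example.local -P 3306 -u someuser -p somedb < bulk_update.sql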

    Read the article

  • Test server on a local network with XAMPP

    - by hopscotch1978
    Hi, I'm not very proficient with networks and could use some help. I've got a Win 7 desktop with XAMPP which acts as my local dev machine. I've configured a virtual host on the desktop which I'm able to access fine. If I'm understanding things correctly, the virtual host uses port 80 (<VirtualHost 127.0.0.1:80>).

    I've just tried to configure a separate Win XP laptop on the local wireless network to connect to the main desktop for testing purposes. I've added the IP address and virtual host name to my Hosts file on the laptop. My virtual host is imaginatively named "virtualhost1". When I type this into my laptop browser, it connects correctly to the main desktop and I get the XAMPP welcome screen. But I can't seem to get to the actual site, just the XAMPP welcome screen. It kind of jumps the browser to http://virtualhost1/xampp/.

    I think it's a port issue of some sort but I have no idea how to resolve it. I would get the same XAMPP welcome screen on my desktop if I omitted ":80" from the virtual host declaration. On my main desktop, typing "virtualhost1" to the browser address bar gives me the site correctly, not the XAMPP welcome screen. Help would be appreciated. Thank you.
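    I'm wondering whether the real issue is that the vhost is bound to 127.0.0.1, so requests arriving on the desktop's LAN address never match it and fall through to XAMPP's default document root. Would something like this in httpd-vhosts.conf be the right shape (the DocumentRoot path is a placeholder for my setup)?

        # match the vhost on any address, not just loopback (Apache 2.2 syntax)
        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName virtualhost1
            DocumentRoot "C:/xampp/htdocs/mysite"
        </VirtualHost>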

    Read the article

  • Vanishing Windows Desktop Shortcut Keys

    - by Henry Keiter
    The Situation

    Like you, I have many applications that I like to open. I've set up keyboard shortcuts for the most common ones by placing a link on the desktop and setting its Shortcut Key property. This is all fine and dandy, most of the time. When I want to bring up the GIMP, I press Ctrl+Alt+G and the GIMP launches. Lovely.

    The Problem

    Sometimes - perhaps once a month per desktop shortcut - the shortcut key assignment simply vanishes. I press Ctrl+Alt+G and nothing happens, so I go check the shortcut and see that, lo and behold, nothing is there. This happens regularly to all my shortcuts (not all at once). It doesn't matter what keys I assign, and there doesn't seem to be any correlation with particular applications being open or anything of that sort. This has happened on every Windows XP machine I've ever used with any regularity. Obviously what makes this issue particularly obnoxious is that it's not easily reproducible.

    I have searched long and hard for a solution for (or at least acknowledgement of) this problem, to no avail, so hopefully you guys know something that I don't. I did find this question, where the answers are all basically "use a third-party app", but as far as I could tell that was a slightly different issue, related to Explorer being busy. If the solution for this turns out to be the same, fine, but I'd prefer a native fix if at all possible.

    Note: I've tagged this with Windows in general because I seem to remember it happening on Windows 7 as well as XP, but I rarely notice it there because I use the Start-menu search in preference to desktop shortcuts.

    Read the article

  • Mac Backup Plan

    - by Chuy77
    I'm reviewing my backup plan and would appreciate any thoughts about what more I should do (if anything) to make sure I'm properly covered in case of all hell breaking loose. :-) I have one machine.

    1) I run a nightly clone with SuperDuper. I alternate the clone drive weekly so I have two clones, one never more than a week old.
    2) I use BackBlaze as a sort of Time Machine in the cloud. It runs all the time and keeps everything on my machine backed up online.
    3) I sync all my 1Password logins, etc. to my iPhone once a week.

    ...And that's it. I feel pretty covered. But I'm always reading stuff like this: http://www.43folders.com/2010/03/15/yes-another-backup-lecture

    And that doesn't even mention online backup, and seems like a huge pain in the behind. But maybe I'm being naive? Should I have more backups? Thanks for any feedback. I really appreciate it.

    Read the article

  • transparent git-svn gateway

    - by azatoth
    Currently we have a Subversion repository with the following layout:

        /trunc
            /group1
                /proj1
                /proj2
            /group2
                /proj3
            /etc..
        /tags
            /group1
                /proj1
                /proj2
            /group2
                /proj3
            /etc..
        /branch
            /anything temporary

    I believe this is a rather bad layout, but at the moment it's difficult to change it fully. Personally I dislike Subversion, mostly due to the long time it takes to check history, and also because branching and merging are cumbersome, etc., so I really want to use git instead. Sadly we can't just switch to git, as the mental leap for some might be too overwhelming, so I was looking into git-svn to see if I could practically use that to solve the issue.

    Sadly that directly ends up in a bad situation, as I want to break each project down into its own git repo, and I don't want to have to recreate the git-svn checkout on each computer I work on. So I thought perhaps there is a possibility of creating some sort of transparent git <-> svn proxy/gateway, so that a push to that repo "commits" to the svn repo, and a commit to the svn repo updates the git repo. Google hasn't been my friend; I have only found generic usage help for git-svn, so I ask you if you have some good ideas to accomplish this.
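    One shape this could take (just a sketch, with a made-up URL and paths, assuming the non-standard layout above) is a single fetch-only git-svn mirror per project on a central box, published to a plain bare repo that everyone else clones from:

        # one-time, on the central box: mirror one project's trunk and create a bare repo to publish to
        git svn clone http://svn.example.com/repo/trunc/group1/proj1 proj1
        git init --bare /srv/git/proj1.git
        cd proj1
        git remote add shared /srv/git/proj1.git

        # from cron: pull new SVN revisions and publish them
        git svn fetch && git svn rebase && git push shared master

    The unsolved half is the reverse direction: getting commits from the shared git repo back into Subversion still needs a "git svn dcommit" run from a clone that carries the git-svn metadata, which is what keeps this from being fully transparent.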

    Read the article

  • Parsing text files

    - by d03boy
    I encountered a situation tonight where I wanted to parse a text file. I had a very, very long word list that contained English words delimited by lines, and I wanted to get rid of every word (or line) that was longer than 7 characters. This would be simple in Linux, but I can't seem to find a simple solution in Windows XP. I tried using Notepad++'s regular expression search, but that was a huge failure: I tried the expression .{6,} without finding any matches. I'm really at a loss, because I thought this sort of thing would be extremely easy and there would be tons of tools to accomplish a task like this. It seems like Notepad++ supports every other feature in the world except the very basic ones that seem the most obvious.

    Another one of my goals was to put some code before and after the word on each line, so that

        aardvark
        apple
        azolio

    would turn into

        INSERT INTO Words (word) VALUES ('aardvark');
        INSERT INTO Words (word) VALUES ('apple');
        INSERT INTO Words (word) VALUES ('azolio');

    What suggestions/tools/tips do you have to accomplish tasks similar to this in Windows XP?
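    For reference, the Unix-style one-liner I had in mind would be something along these lines (it assumes the GNU tools are available on the XP box via Cygwin or GnuWin32, and that the word list is in words.txt):

        # drop lines longer than 7 characters, then wrap each survivor in an INSERT statement
        awk 'length($0) <= 7' words.txt | sed "s/.*/INSERT INTO Words (word) VALUES ('&');/" > words.sql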

    Read the article

  • Load balanced IIS. Should I use NLB, or linux-based reverse proxy, or something else?

    - by growse
    What would be the best approach for load-balancing at least 2-3 Windows 2008 R2 IIS webservers running a multitude of .NET applications? My choices appear to be:

    1) A hardware-based network load balancer, like a Cisco CSS
    2) Windows NLB
    3) Some sort of Linux-based proxy, either HAProxy or other

    The three servers sit as VMs on a vSphere farm, so I have the ability to clone to up the instance count in times of high load. I control the switch that the vSphere hosts are plugged into (a Cisco 3750), but I don't control the switching/routing infrastructure beyond that to the clients.

    (1) is too expensive, and probably overkill for my needs. I've included it in case someone figures out a cunning way to do it on my existing network kit, which I doubt. (2) would seem to be the obvious "built-in" option, but seems to be quite fiddly, messing around with network interfaces, multicast, and generally other things that seem needlessly complex. It's also fairly stupid, in that it can't remove hosts from the pool if they start throwing 500 errors or otherwise go wrong. (3) is the most interesting option, as it would appear to offer the most flexibility and customizability, without having to mess around with the network. However, while I'm familiar with the reverse-proxy capabilities of lighttpd etc., I'm not that well read on other options like HAProxy, which might be able to offer a lot more.

    Which would you go for, and is there anything I've not thought of?
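    For option (3), the HAProxy piece would be fairly small - something like the sketch below, with placeholder names, IPs and check URL, where the health check is what lets a backend that starts returning errors get pulled out of the pool automatically:

        # minimal HTTP front end balancing two IIS boxes (sketch; adjust names/IPs/check URL)
        frontend www
            bind *:80
            mode http
            default_backend iis_farm

        backend iis_farm
            mode http
            balance roundrobin
            option httpchk HEAD /healthcheck.aspx HTTP/1.0
            server iis1 10.0.0.11:80 check inter 2000 rise 2 fall 3
            server iis2 10.0.0.12:80 check inter 2000 rise 2 fall 3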

    Read the article

  • Looking for a suitable backup solution from Mac OS X to an offsite CentOS 6 server for 1TB of working data

    - by Brady
    I'll start by saying what we have in place currently:

    - An on-site file server (Mac OS X Server) that is used by GFX designers; they have a working set of 1TB of data.
    - An offsite server with 2TB of available storage (CentOS 6).
    - The Mac OS X server rsyncs data to the offsite server every 6 hours (rsync -avz --delete --progress -e ssh ...).
    - The Mac OS X server does a full data backup to LTO4 tape on a 10-day recycle (Mon-Fri for 2 weeks).
    - rsync pushes about 60GB of file changes a day.

    The problem: the onsite tape backup is failing, as 1TB of graphics files doesn't compress well enough to fit onto an 800GB LTO4 tape. The backup is incredibly slow when doing a full run, and it's a pain in the backside getting people to remember to change the tape - it often gets forgotten, etc.

    The quick solution: buy an LTO5 drive and tapes. However, this has been turned down because of the cost...

    What I would like:

    - Something that works the same way rsync works: only changed data is sent over the wire, and it can be scheduled to run multiple times during the day.
    - Data that is sent is compressed and sent over SSH.
    - Something that keeps a 14-day retention but doesn't keep duplicate data. So, as an example, if I have 1TB of working data and 60GB of changes are made each day, then I expect around 1.84TB of data to be stored on the offsite server.
    - To work with the Mac OS X server and the CentOS 6 server.
    - Not cost an arm and a leg - it must be cheaper than buying an LTO5 drive with tapes (around £1500).
    - Be able to be set up to run autonomously.
    - Have some sort of control panel that will allow an admin to easily restore a file/folder.

    Any recommendations?
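    One approach that ticks most of those boxes without new licences is rsync itself with --link-dest, which keeps dated snapshot directories on the CentOS box where unchanged files are hard links rather than copies (rsnapshot and rdiff-backup are packaged tools built around the same idea). A sketch of the job run from the Mac, with placeholder paths and hostnames:

        # push today's snapshot, hard-linking anything unchanged against the previous one
        TODAY=$(date +%F)
        rsync -az --delete -e ssh --link-dest=../latest \
              /Volumes/Work/ backup@offsite:/backup/$TODAY/
        # point "latest" at the new snapshot and prune anything older than 14 days
        ssh backup@offsite "ln -snf /backup/$TODAY /backup/latest && \
              find /backup -maxdepth 1 -type d -name '20*' -mtime +14 -exec rm -rf {} +"

    With 60GB of daily churn, disk usage should land close to the 1.84TB estimate above, since unchanged files cost only a hard link. A web front end for restores would still need to be added separately.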

    Read the article

  • Wireless to Wireless Transfer Slow on a Linksys WRT54GL

    - by Kyle Brandt
    The Situation: When I try to transfer a file from one computer to another, both connected via wireless to a WRT54GL (in an office) with dd-wrt firmware, I often get bad speeds. In general they average around 100 kilobytes a second, yet either computer can download via wireless from the Internet at about 2 megabytes a second. The speed is slow even with the transfer of one large file. There are about 20 other wireless networks that the computers can see, so there is a lot of noise, but I don't have the equipment to really monitor the frequencies well. Even so, that still seems pretty slow. I thought maybe it was the transmit power on each card, but even when they are 5 feet apart with line of sight I still get these speeds. According to Linux, both cards are operating at 54g.

    My Questions:

    - Is this normal for this sort of consumer-level wireless equipment? Anything I can do to improve it?
    - Why is wireless-to-wireless transfer slow when everything else isn't?
    - What steps might I take to figure out what is happening? For example, are lots of packets not making it to the access point, requiring retransmissions?

    Above all, I want to find out what the problem actually is. This may seem odd, but at this point I am more interested in understanding the problem than fixing it.

    What I have tried: I have tried messing with lots of settings - different channels, xmit power, G-only - none of which has made anything any better. I've also tried upgrading to a newer dd-wrt firmware version and doing a reset to wipe out the settings.
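    One measurement worth adding (a sketch; the IP is a placeholder) is an iperf run between the two wireless clients, which takes disks and file-sharing protocols out of the picture. Note that client-to-client traffic through an access point crosses the air twice, so roughly half the usual throughput is the expected baseline - but that should still be far more than 100 KB/s:

        # on computer A
        iperf -s
        # on computer B (run for 30 seconds against A's address)
        iperf -c 192.168.1.101 -t 30

    Comparing that number with an iperf run from a wireless client to a wired machine on the same router would show whether the problem is specific to the wireless-to-wireless path.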

    Read the article

  • Two parts: linux startup script to connect to bluetooth and cron to keep it connected

    - by D.R.
    I have a mini Bluetooth keyboard and a Raspberry Pi running a Debian-based distro. I know the MAC address of the keyboard, but for this question let's just use AA:BB:CC:DD:EE:FF. Right now I have to keep a wired keyboard connected, as well as the Bluetooth dongle for the mini keyboard, and on the wired keyboard I have to run the following when the device boots up:

        sudo hidd --connect AA:BB:CC:DD:EE:FF

    If the device goes idle for too long, the Bluetooth disconnects and I have to pull out my wired keyboard and retype that same command. What I'm looking for is a way to have that command run at startup, and a way to sense if it gets disconnected so that it will auto-reconnect. The annoying thing is that the keyboard has to be in pairing mode (even though it has already been paired) when I run that command, otherwise it tells me the host is down. So perhaps the script needs to prevent it from disconnecting due to inactivity; otherwise I'll have to put it back in pairing mode to reconnect.

    So, to recap:

    - A script to connect at startup (I can make sure to put the keyboard into pairing mode before turning it on)
    - A script to prevent it from disconnecting (maybe some sort of signal to send to it every 60 seconds or something?)

    Any help with this is greatly appreciated! StackOverflow is always the best place to find answers to weird questions! I've been searching long and hard for an answer, but finally had to resort to coming here! Thanks!
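    The shape I had in mind is a small keep-alive loop - a sketch, assuming the hidd from bluez-utils is what's installed - started from /etc/rc.local (with a trailing & before the exit 0 line) so it runs from boot:

        #!/bin/sh
        # /usr/local/bin/bt-kbd-keepalive.sh - reconnect the keyboard whenever it drops
        MAC="AA:BB:CC:DD:EE:FF"
        while true; do
            # hidd --show lists currently connected HID devices
            if ! hidd --show | grep -qi "$MAC"; then
                hidd --connect "$MAC"
            fi
            sleep 30
        done

    Whether the reconnect works without touching the keyboard still depends on the keyboard itself; some only accept connections while in pairing mode, in which case keeping the link alive (polling before the idle timeout) is the more realistic half of this.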

    Read the article

  • Disable the use of Internet Explorer through policies when called from HTML Help

    - by Stephane
    Hello, I have a locked-down environment where users are prohibited from doing, well, basically anything but running the specific programs we specify. We just switched a program from the venerable WinHelp format to HTML Help (CHM), but that seems to have an unwanted and rather dangerous side effect: when a user clicks on a hyperlink inside the HTML help, a new Internet Explorer window is opened and the user is free to browse and do terrible things to my server (well, not that much, but still...).

    I have checked the session in this case, and the IE window is actually hosted within the help engine: there is no iexplore.exe process running in the user session (and there cannot be: it's explicitly prohibited). We have disabled all help right now until we find a solution. I'm working with the help team to have all external URLs removed from the help file, but that is going to be a long and error-prone task.

    Meanwhile, I've checked all the group policy options, but I have to say that I was unable to find anything that would prevent a standalone IE window hosted in a random process from running. I don't want to disable WinHTTP or the IE rendering engine or anything of the sort, but I need to prevent all users who are members of a specific AD user group from ever having an IE window displayed to them. The servers are running Windows 2003 and Citrix MetaFrame 4.5. Thanks in advance.

    Read the article

  • Convert a cassette tape recording to digital format

    - by Optimal Solutions
    Has anyone been successful with transferring audio cassette tape recordings to a digital format? I would like to preserve old cassette tape recordings of my grandparents in some digital format: MP3, WAV, etc. The quality of the tapes is mediocre. I think I can handle the quality restoration, but getting the audio from tape to digital is my question. Below is a list of the hardware I can work with.

    Cassette deck: I have a Technics stereo cassette deck, model RS-B12. It has separate left and right IN and OUT RCA-type jacks on the back. On the front it has a headphone phono jack, plus left and right mic input phono jacks.

    On the computer side: I have a Windows Vista PC with no additional software other than what came with the machine from Costco - no sound editing software that I can see. There is no sound card on the PC. On the front panel there is a mini-phono mic input jack, and there are several different types of in/out mini-phono jacks on the back, in addition to USB and FireWire. I also have access to a new (2009) iMac with a mini-phono input jack for a powered mic or other audio source, plus GarageBand, which came with the computer, in addition to USB and FireWire.

    What are my options for getting these cassette recordings into a digital format? What's the best format? What sort of wires would I need, will I want to utilize the USB or FireWire, or can I simply use the audio inputs on the PC (or Mac) to receive the audio stream?

    Read the article

  • WEIRD netstat behavior on Windows XP re: www.partypoker.com

    - by tbone
    I really don't know if this is the right place to ask this, but I would really appreciate it if someone more savvy with Windows XP (Professional) could help me out. For background, I am a 10+ year programmer, so I'm not a total idiot, but I am far from an expert on TCP/IP, etc., and this has me totally confused.

    When I do a netstat (on Windows XP) I seem to always get a huge number of www.partypoker.com connections, and I can't figure out where they are coming from. A netstat -o shows me that some are coming from PID xxx, which is Firefox, but if I kill it, the connections still remain. Some are coming from PID 0, which makes no sense to me.

    SECOND PROBLEM: One would think you could edit the C:\WINDOWS\system32\drivers\etc\hosts file to block this, but it seems like my machine is ignoring the hosts file! (I have tried with the DNS Client service both enabled and disabled, same result.)

    So I just rebooted, killed all my normal programs, and I can't seem to reproduce the problem. If I was a paranoid person, I would think there was some sort of intelligent trojan running. I am running Windows XP Pro, Kaspersky Antivirus, CCleaner, and am fully up to date on Windows Update. What gives?

    So, I guess my questions are:

    1. Is anyone else seeing these weird connections to partypoker.com?
    2. Why isn't my hosts filter working?
    3. Is there some utility I can run to find out what's happening? I've tried autoruns.exe from Sysinternals but don't see anything interesting.

    Am I the only one with this problem? If there are any additional things you need me to run, let me know.
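    A couple of checks that might narrow this down (a sketch; run from an Administrator command prompt, and the PID is a placeholder). PID 0 entries are often just connections left in TIME_WAIT after the owning process has exited, and if a proxy is configured in the browser, the hosts file never gets consulted for the final destination:

        rem which executable owns the live connections?
        netstat -b -o -n | findstr /i partypoker
        rem map a PID from that output back to a process and its services
        tasklist /svc /fi "PID eq 1234"
        rem is the name even being resolved locally? (a proxy or hard-coded IP would bypass hosts)
        ipconfig /displaydns | findstr /i partypoker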

    Read the article

  • What could cause a TFTP-reloaded Cisco `running-config` on an 871 to fail?

    - by xtian
    Cisco CCP's Write Configuration borked my 871W config while I was trying to set up port forwarding, so I went through the basic steps to reconfigure the router. I looked to see if I could just reset the router - nope. I tested the 871's flash memory with fsck to see if there was a hardware failure - nope. Then I rewrote the minimal config for TFTP (which is the same as for Cisco's CCP app) and successfully uploaded a previously working running-config from Windows Vista using the SolarWinds TFTP Server. Unfortunately, the restore was not entirely successful.

    The old running config was saved to the 871's startup-config and I can log in using the console port. Some other things that are working are the hostname and welcome message, but that's about it. Startup shows an error, SETUP: new interface NVI0 placed in "shutdown" state, after the TFTP transfer. The missing light on the access point modem for the Ethernet link shows that the 871's outside FE4 is not working.

    So... what's the possible problem with reloading a previously working config (approximately 4 months with the same config) via TFTP? Is there something I can look for on the 871 to verify the config? Or on Vista, to validate the config file itself before I transfer it? Or is this a common TFTP issue?

    UPDATE: I missed the instruction from Cisco's TFTP page to delete the aaa lines from the config (although a video from a SuperUser user didn't make this point in his most excellent demo). There were several lines of this sort; I deleted them and uploaded again. However, they're back. I assume they're added automatically? [Nope - see answer.]

    UPDATE 2: The reload of the previous settings was successful, but this error remains. I don't even know now if it was there before or not. It seems irrelevant to the question.
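    One thing I would check, since copying a config in over TFTP merges it into the running config rather than replacing it: an interface that was administratively shut down before the copy stays shut down unless the file explicitly contains a no shutdown for it. A quick sketch from the console (standard IOS commands; FastEthernet4 is the 871's WAN port and Vlan1 the inside SVI):

        show ip interface brief
        configure terminal
         interface FastEthernet4
          no shutdown
         interface Vlan1
          no shutdown
         end
        copy running-config startup-config

    If FE4 comes up after that, the uploaded config itself is probably fine and it was only the interface state that didn't survive the merge.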

    Read the article

  • How would I force Debian to use the physical sector size on a hard disk?

    - by Confused User
    I just purchased a few new 3TB WD drives. These have physical 4k sectors, but there is some sort of layer providing 512B logical sectors (see the partition table below). In order to try to get some more speed out of the drives, I would like to get rid of this logical layer and actually use the physical 4k sectors. However, I can't figure out how to do this (or even whether it's possible) from the man pages of fdisk and parted, or from searching Google. Does anybody know how this could be done?

    As to why this is relevant, this page demonstrates that merely aligning the sectors properly can already make up to a 25% speed difference for reads, and more than 2500% for writes in some cases! Getting rid of the logical sectors in favour of the physical ones should improve speeds even more. Thanks!

        $ parted /dev/sdc
        GNU Parted 2.3
        Using /dev/sdc
        Welcome to GNU Parted! Type 'help' to view a list of commands.
        (parted) print
        Model: ATA WDC WD30EZRX-00M (scsi)
        Disk /dev/sdc: 3001GB
        Sector size (logical/physical): 512B/4096B
        Partition Table: gpt

        Number  Start   End     Size    File system  Name  Flags
         1      1049kB  3001GB  3001GB               zfs
         9      3001GB  3001GB  8389kB

    P.S. I don't care about the data on the drives, I was just playing with different file systems. Also, this is my first time posting here, so please let me know if my posts should be formatted differently, etc.
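    For context on what can actually change here: the 512-byte logical sectors are presented by the drive's own firmware (a "512e" Advanced Format drive), so the operating system can't switch the drive to 4k logical addressing. What is under the OS's control is keeping partitions aligned to 4k boundaries, which the 1049kB (1MiB) start above already satisfies. A quick sketch of how to confirm what the kernel sees and check alignment:

        # logical vs physical sector size as reported by the kernel
        cat /sys/block/sdc/queue/logical_block_size /sys/block/sdc/queue/physical_block_size
        blockdev --getss --getpbsz /dev/sdc
        # ask parted whether partition 1 is optimally aligned
        parted /dev/sdc align-check optimal 1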

    Read the article

  • How to Load Balance 2 Internet Connections on a Windows 7 machine?

    - by Jimmy Chandra
    It's sort of related to this particular question, but that one is on Mac. I am looking for a similar solution on Windows 7. I have 2 network connections:

    - (Connection A) A wireless terminal connecting to ISP A (3G/EVDO internet provider)
    - (Connection B) A broadband wired connection connecting to ISP B (cable internet provider)

    Both have access to the internet. When I try connecting to a website and check the Networking tab in Task Manager, I only see the network traffic being routed over Connection A. Is there a way to make the computer utilize both networks (in the sense of using all the bandwidth available from both the cable ISP and the 3G/EVDO ISP) at the same time? If so, what do I need to do to set this up... on Windows 7?

    Here is a bit more info on my network connections (ipconfig /all):

        PPP adapter Wireless Terminal:
            IPv4: aa.bb.ccc.ddd (preferred)
            Subnet mask: 255.255.255.255
            Default gateway: 0.0.0.0
            DNS: aa.ee.f.ggg, aa.ee.f.hhh
            Primary WINS: jjj.ii.k.l
            Secondary WINS: jjj.ii.k.m

        Ethernet adapter LAN:
            IPv4: 192.168.1.100 (connected by wire to a router that itself connects to a cable modem)
            Subnet mask: 255.255.255.0
            Default gateway: 192.168.1.1 (the wireless router)
            DHCP: 192.168.1.1 (the wireless router)
            DNS: xxx.yy.zz.ww, rr.sss.t.uuu

    For my own privacy I don't believe the actual numbers matter; the patterns are representative of the IP numbering scheme...

    Read the article

  • What's next for all of these Microsoft "overlapping" and "enhanced" products?

    - by indyvoyage
    Recently I attended a road show organised by an MS Gold Partner company in the UK. The products discussed were: SharePoint Server (2010 and 2007), Exchange Server, Office Communications Server 2007, Exchange Hosted Services, Office Live Meeting, Office Communicator, System Center Configuration Manager and Operations Manager, VMware, Windows 7, etc.

    While Microsoft touts the enhancements in each product over the previous version, I felt that clients are not much interested in all these details. Take Office Communicator, for example: they have surely improved the product a lot, and at first sight everyone said "wow, great product", but nobody wishes to pay money for all these extra features. Some argued that they are bogged down by the increased number of menus, and that they don't need the soft-call feature integrated with mobile calls. The same applies to all the other products, such as MS Office (what next, 2 ribbons?), the Windows OS and many more. There must indeed be good features in all these products, but is it worth spending money and time to update the older systems? Sometimes these features decrease productivity instead of increasing it.

    So do you think that whatever enhancements MS makes to these products are only for selling purposes, not real use - and also a way to keep developers busy learning new tools and features? I am sure some people here will argue that some people need this sort of feature, but I am not talking about NASA or MI5 guys. I am talking about usual businesses and Joe Public. Any ideas welcome.

    Read the article

  • Backup plan for linux webserver in small business?

    - by radman
    Hi, I am currently in the process of writing a backup plan for the webserver used by my business. I am very new to this area and have a few ideas about how things should work, but am unsure of what tools to use and what sort of restore process is appropriate. I'm looking for something relatively simple - it doesn't have to be 100% paranoid, just enough to give me a reliable backup. Speed is not of the essence and there is not going to be a live fallback in place. The backup will be onto a single HDD that will be stored onsite (no option for offsite as yet), and backups will take place weekly. I am constrained by both time and money, which is why I'm aiming for a good-enough solution.

    Is taking an image of the webserver's system drive periodically and using that as the backup appropriate? Should I be testing that the backups restore correctly every time I perform one? This is a bit broad, but what setup would you use if you were in my place, given the services I am running? Should I add additional machines and split the services? Any advice is much appreciated! See below for server details.

    Webserver
        Platform: Linux (Ubuntu Server)
        Running: mail server, SVN server, MediaWiki, WordPress, Apache webserver
        Hardware: single 500GB SATA drive
        Architecture: single machine behind a router (with firewall), accessible from the internet
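    Alongside (or instead of) a full disk image, a weekly service-level dump copied to the backup disk is cheap insurance and much easier to restore selectively. A sketch, with guessed paths for the repository and web roots, and assuming MySQL credentials in /root/.my.cnf:

        #!/bin/sh
        # weekly-backup.sh - dump service data, then archive configs and web roots
        DEST=/mnt/backupdisk/$(date +%F)
        mkdir -p "$DEST"
        mysqldump --all-databases --single-transaction | gzip > "$DEST/mysql.sql.gz"   # WordPress/MediaWiki
        svnadmin dump /var/svn/repo | gzip > "$DEST/svn.dump.gz"
        tar czf "$DEST/files.tar.gz" /etc /var/www /var/mail

    Testing a restore at least once (into a VM, for example) is worth far more than any particular tool choice.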

    Read the article

  • Loss of network connectivity when playing video on Optoma HD180 projector

    - by Jeff Fohl
    Hi folks - new to Super User, so I hope this question fits in with the guidelines. Very strange problem I am having, and I am at a loss as to how to continue troubleshooting it. The basic problem is that when I attempt to watch streamed video on a particular display device (an Optoma HD180 projector), my network connectivity drops like a stone to barely measurable levels.

    This is my setup: I have a Dell H2C 730x running Windows 7 64-bit. This particular computer has two ATI Radeon HD 4800 video cards. I have two Samsung 22" monitors connected to one card, and an Optoma HD180 digital projector connected to the other card via an HDMI cable. My internet connection is normally a reliable 6Mbps.

    The problem occurs when I stream video (or even just browse the web) on the Optoma projector. When I do this, my internet connection drops to practically zero (just a few kilobits per second). When I move the browser away from the projector and over to one of my Samsung monitors, the internet connection comes right back. Note that the Optoma projector is on and enabled as a third monitor all this time; I can move the mouse around on the projector without triggering the problem.

    I tried pinging my router while playing a movie on one of the monitors, and I get a 1 millisecond response. However, when I have the movie playing on the Optoma projector, pinging the router gives me response times in the hundreds of milliseconds, or it times out completely. So it clearly is something local to my machine, and not some sort of throttling occurring down the line. I would think that it is possibly something to do with the HDMI driver conflicting somehow with my network driver (which is a USB-based wireless connection). This one has me really stumped. Anyone have any ideas?

    Read the article

  • Estimating compressed file size using a list parameter

    - by Sai
    I am currently compressing a list of files from a directory with the following command:

        tar -cvjf test_1.tar.gz -T test_1.lst --no-recursion

    The above command will compress only those files mentioned in the list. I am doing this because the list is generated such that it fits on a DVD. However, the achieved compression brings the result in well under the estimated size, and there is abundant space left on the DVD. This is something like the knapsack problem: I would like to estimate the compressed file size in advance and add some more files to the list. I found that it is possible to estimate a compressed size using the following command:

        tar -cjf - Folder/ | wc -c

    but this form does not take a list parameter. Is there a way to estimate the compressed file size for a list of files? I am also looking into options like Perl scripts, etc.

    Edit: I think I should provide more information, since I have been doing a lot of web searching. I came across a Perl script (link) that sort of emulates the knapsack algorithm. The current problem with the above-mentioned script is that it splits the files in their original (uncompressed) state. When I compress the files after splitting them, there are opportunities for adding more files, which I consider to be inefficient. There are 2 ways I could resolve the inefficiency:

    a) Compress individual files and save them in a directory using a script. The compressed files could provide a better estimate; I could generate a script using the folder of compressed files and use it on the uncompressed ones.
    b) Check whether the compressed file's size is less than the required size. If so, I should keep adding files until I meet the requirement. However, the addition of new files to the compressed file is an optimization problem by itself.
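    For the estimate itself, the list form and the streaming form can be combined - tar's -T option works regardless of where the archive is written, so the compressed size of exactly the files in the list can be measured without creating an archive on disk:

        # byte count of the bzip2-compressed archive for the files named in test_1.lst
        tar -cjf - -T test_1.lst --no-recursion | wc -c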

    Read the article
