Search Results

Search found 10543 results on 422 pages for 'big bang theory'.

Page 207/422 | < Previous Page | 203 204 205 206 207 208 209 210 211 212 213 214  | Next Page >

  • Checking for orphaned snapshots - ESXi5

    - by Tim Alexander
    So we had some issues with our passive mail node over the weekend doing vmtools updates, and to resolve a problem we had to revert to a snapshot and then reseed all the databases across. All in all everything seemed fine: the server works and CCR copy status is running fine. I used the "Delete All" option this morning to remove the snapshot, and according to vCenter the process completed with no errors and no "Needs Consolidation" flag. This all seems fine until I check the datastore that holds the VM on our SAN, where I can clearly see snapshot files that are pretty big [see attached image]. These do not seem to be changing size, and the date modified is around the time the work was started for the vmtools update. Does this possibly mean that at some stage, possibly during reversion or hard resetting of the VM, they have become orphaned? Are there any methods to check the orphaned status of snapshots? We are running ESXi 5.0 Update 1 with storage provided by an EMC SAN, licensed at Enterprise Plus.
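
    One way to check for orphans is to compare the delta disks actually sitting on the datastore against the snapshots ESXi believes exist; a minimal sketch from the ESXi shell (the Vmid 42 is hypothetical):

        # List any snapshot delta disks still present on the datastores
        find /vmfs/volumes -name '*-delta.vmdk' -exec ls -lh {} \;

        # Compare against what ESXi itself tracks
        vim-cmd vmsvc/getallvms            # note the Vmid of the mail node
        vim-cmd vmsvc/snapshot.get 42      # hypothetical Vmid from above

    If find shows delta files for the VM but snapshot.get reports none, they are likely orphaned; the usual first fix is to take a new snapshot and run "Delete All" again so the consolidation pass picks up the strays.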

    Read the article

  • Limit disk I/O one program creates?

    - by Posipiet
    Hardware: one virtualization server. Dual Nehalem, 24 GB RAM, 2 TB mirrored HD. Software: Debian, KVM, virt-manager on the server, with several virtual machines that use Linux too. The 2 TB disk is one big LVM volume group; each VM gets a logical volume and makes its own partitions in that. Problem: one of the programs that runs on one of the VMs creates a huge disk load. This was never an issue before, because the program never ran on such powerful hardware. Now the CPUs are fast, and lots of I/O is the result. We can't do much about that at the moment, because the tool is a black box. On the other hand, the speedy computation is welcome. The program creates about 5 GB of temp files which get overwritten during the next iteration. Question: how can we limit the disk I/O for the process?
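
    If the guest (or host) kernel is using the CFQ I/O scheduler, the cheapest lever is ionice; a sketch, where the process name is a placeholder for the black-box tool:

        # Idle class: the process only gets disk time when nobody else wants it
        ionice -c 3 -p $(pgrep -f blackbox-tool)

        # Or, less drastic: best-effort class at the lowest priority
        ionice -c 2 -n 7 -p $(pgrep -f blackbox-tool)

    The same trick works on the host against the whole VM, since each KVM guest is just a process there. And if the temp files are the main churn, mounting their directory as tmpfs inside the guest (assuming the VM has RAM to spare for 5 GB) may sidestep the disk entirely.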

    Read the article

  • SQL Server 7 Transaction Logs Issues

    - by nate
    Over the week, our database server's transaction log filled up. With our app, people could select from the database but could not update or insert into it. In the past we have just truncated the transaction logs, and after that everything was back to normal. This week I truncated the transaction logs and shrank the database. Now we can select, update, and insert into the database. The only issue is that when we run a big job and do a lot of inserting or updating, we get the following error: Database error: S1008:[Microsoft][ODBC]Operation canceled. We never had this issue before. I am assuming that this is the same as a timeout error. Has anyone else had this issue before, or does anyone know how I can resolve this?
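
    For what it's worth, S1008 is the ODBC SQLSTATE for a canceled statement, which usually means the client-side query timeout fired rather than the server failing, so raising the app's ODBC query timeout for the big job is worth a try. On the maintenance side, a sketch of the truncate-and-shrink routine from a command prompt (the database and logical log file names are hypothetical):

        REM Truncate the log without backing it up (SQL Server 7/2000 syntax)
        osql -E -Q "BACKUP LOG MyDatabase WITH TRUNCATE_ONLY"

        REM Shrink the physical log file back down (target size in MB)
        osql -E -d MyDatabase -Q "DBCC SHRINKFILE (MyDatabase_Log, 100)"

    Scheduling regular log backups, or turning on the 'trunc. log on chkpt.' database option for databases you never restore from log backups, keeps the log from filling up in the first place.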

    Read the article

  • Ubuntu Lucid: Erratic screen behaviour after boot

    - by fgysin
    In short: about 50% of the time I have a screwed-up monitor setup after reboot; about 50% of the time it is totally correct. Now the longer version: I updated my machine from 9.04 to 10.04 (via 9.10). At first I ran into some monitor problems (I have a 3-monitor setup) because of the known bug in the new X server driver for Xinerama. This messes up behaviour if the mouse goes either left of or above screen number 0, i.e. I had to make my left-most monitor screen 0. Everything worked out fine in the end; I got my 3-monitor setup back with Xinerama enabled to get one big desktop stretched over 3 screens. Now the fun part: every time I start up my machine, only one of the 3 monitors gets a signal and is woken up: it only recognizes the left-most monitor (screen 0) and crams all the desktop stuff into this one screen. If I go into nvidia-settings I only see one physical device, although all 3 are connected and have power. When I look into the xorg.conf I can still see my old setup with 3 devices, 3 screens, Xinerama active, etc., but I was totally unable to get 3 monitors to work. (I tried unplugging monitors, reconfiguring the whole nvidia setup, ...) But it gets even better: when I restart my machine (i.e. choose the restart option from the Ubuntu menu) it shuts down and tries to restart. The restart then gets stuck after showing the Ubuntu splash screen with the 'loading bar' (the moving dots thingy) and I am forced to kill the machine by cutting power. But after the power cut the machine boots up normally, and suddenly I get my 3-monitor setup back, working. That is, until the next time I shut down and start up, where it all starts over again and I only have one monitor... (see above) I really have a hard time seeing where the error is. It must be that the restart boot somehow differs from the 'normal' boot. But the fact that it gets stuck and I need to cut power, which then basically triggers a 'normal' boot, does not really support this theory... My setup (please tell me if you need further info): 3 monitors as 3 screens as one desktop (with Xinerama); 2 nvidia cards, where screens 0 and 1 are on card 0 and screen 2 is on card 1; Ubuntu 10.04 Lucid Lynx (updated from 9.10, 9.04, ...). I would appreciate every idea on the subject; at the moment I really don't have any clue what to do...
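
    For reference, a minimal sketch of the xorg.conf pieces such a layout normally needs; the identifiers and BusID here are hypothetical, and Option "ConnectedMonitor" is an NVIDIA-driver knob that can help when detection is flaky at boot:

        Section "ServerLayout"
            Identifier "ThreeScreens"
            Screen 0 "Screen0" 0 0
            Screen 1 "Screen1" RightOf "Screen0"
            Screen 2 "Screen2" RightOf "Screen1"
            Option "Xinerama" "1"
        EndSection

        Section "Device"
            Identifier "Card1"
            Driver     "nvidia"
            BusID      "PCI:2:0:0"               # hypothetical second card
            Option     "ConnectedMonitor" "DFP"  # skip runtime detection
        EndSection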

    Read the article

  • Unable to Boot from USB External Hard Drive

    - by Josh Stodola
    I recently upgraded my main development machine to Windows 7. This involved wiping out my primary boot drive (Windows XP 64-bit) and starting clean. Before I wiped it, I did a direct disk-to-disk copy to a big external hard drive I have. While I have been able to migrate most of the necessary files without any problems, I wanted to boot from it today to check a few settings. I plugged in the hard drive, rebooted, and changed the BIOS to boot from USB-HDD first. But no matter what I do, it always boots from my primary drive into Windows 7. I do not see any kind of error message or anything. How can I boot into Windows XP 64-bit on this external hard drive?

    Read the article

  • Can Apache 2 be configured to start sending gzipped data early?

    - by rikh
    We have Apache set up to gzip-compress HTML pages before they are sent to the client browser. However, some of our pages are slowish to generate, and it seems that Apache is holding on until it has the complete page, compressing it, and then sending it to the browser. There are big chunks of the page (the main, important bits) that are actually generated and output fairly quickly. Is it possible to configure Apache to start compressing and sending data for the page as soon as the script starts outputting something? If it is, can you offer any help on how to do this? If not, can you suggest any other way to get gzip compression working for the server? The scripts that generate the pages are written in PHP. We are using Apache 2.0 on Linux.
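
    For what it's worth, mod_deflate is a streaming output filter: it compresses and flushes in chunks (its documented buffer default is 8096 bytes), so the wait for the whole page is more often PHP's output buffering than Apache. A sketch of both halves, assuming mod_deflate is available in your build:

        # httpd.conf: compress HTML as it flows through the filter chain
        LoadModule deflate_module modules/mod_deflate.so
        AddOutputFilterByType DEFLATE text/html
        DeflateBufferSize 8096

    On the PHP side, set output_buffering = Off in php.ini (or call flush() after the important early chunks) so content reaches the filter chain as soon as it is generated.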

    Read the article

  • Revo 3610 not doing HDMI handshake

    - by DoomStone
    I am having a problem with my Revo 3610, which is connected to my TV via HDMI. For some reason it will not do the HDMI handshake with the TV, so the TV does not think there is anything in the HDMI port. I have tested the TV and it works fine with my laptop and DVD player. It does work sometimes, but this time it has failed 2 days in a row, and I have tried rebooting, turning the TV off and on, and so on; nothing helps. I can trick the TV into listening to the HDMI input by connecting my laptop and then switching the HDMI cable back to my Revo; this, on the other hand, results in the image going through nicely but with a big fat "Check signal cable." on the screen. I have also tried changing the resolution on the Revo, but this does not help either. Has anyone had this problem before, and if so, how did you fix it? Example: http://i.imgur.com/gguZ4.jpg
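
    If you can get into an X session (e.g. after the laptop trick), one common workaround on NVIDIA hardware is to capture the TV's EDID once and then feed it back to the driver, so the handshake no longer matters at boot. A sketch for xorg.conf; the output name DFP-0 and the file path are assumptions (nvidia-settings can save the EDID to a file):

        Section "Device"
            Identifier "ION"
            Driver     "nvidia"
            # Pretend the TV is always attached, using a saved EDID,
            # instead of relying on the flaky HDMI handshake
            Option "ConnectedMonitor" "DFP-0"
            Option "CustomEDID"       "DFP-0:/etc/X11/tv-edid.bin"
        EndSection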

    Read the article

  • MySQL keeps crashing our server. Please help adjust my.ini!

    - by TruMan1
    I have MySQL 5.0 installed on a Windows 2008 machine (3 GB RAM). My server crashes on a regular basis (almost once a day), logging this: "Changed limits: max_open_files: 2048 max_connections: 800 table_cache: 619". I did not use the heavy InnoDB .ini file, although I am rethinking that I should have? I am worried that big configuration changes will make my current sites stop working. What should I do? Here are my current ini settings:

        default-character-set=latin1
        default-storage-engine=INNODB
        max_connections=800
        query_cache_size=84M
        table_cache=1520
        tmp_table_size=30M
        thread_cache_size=38
        myisam_max_sort_file_size=100G
        myisam_sort_buffer_size=30M
        key_buffer_size=129M
        read_buffer_size=64K
        read_rnd_buffer_size=256K
        sort_buffer_size=256K
        innodb_additional_mem_pool_size=6M
        innodb_flush_log_at_trx_commit=1
        innodb_log_buffer_size=3M
        innodb_buffer_pool_size=250M
        innodb_log_file_size=50M
        innodb_thread_concurrency=10
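
    The "Changed limits" line is not the crash itself; it is mysqld warning that it lowered the requested table_cache and max_connections to fit under max_open_files. As a rough sanity check, worst-case memory use is the global buffers plus max_connections times the per-connection buffers, and 800 connections is a lot for 3 GB of RAM. A sketch of one direction to try (illustrative numbers, not tuned for your workload):

        # my.ini sketch: cut the connection count before cutting caches
        max_connections=200
        table_cache=619
        query_cache_size=64M
        innodb_buffer_pool_size=256M
        sort_buffer_size=256K
        read_buffer_size=64K
        read_rnd_buffer_size=256K

    Change one thing at a time and watch the error log; if the sites never actually reach hundreds of simultaneous connections, lowering max_connections should be invisible to them.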

    Read the article

  • Which CMS for a mobile app? No HTML, just XML or JSON.

    - by Sascha
    I am a newbie in content management systems. I need a CMS that can deliver content as XML or JSON to a client. It is OK if the CMS can also manage HTML websites, but the priority is on data transfer over a web service. Which is the best CMS to use here? I want to avoid spending endless hours learning all the big CMS systems just to find out that they don't support this feature, or that it's badly integrated. Thanks.

    Read the article

  • Scanned JPEGs are large and slow to load - can they be optimized losslessly?

    - by Alistair Knock
    I have hundreds of JPEG photographs which were scanned about 5 years ago from negatives using a Konica Minolta DiMAGE Scan Dual IV. The dimensions are ~4500x3000, and the file size is around 12 MB, compared to shots from a DSLR with dimensions of 3000x2300 and a file size of 2-4 MB (actually, these are the output from a RAW converter). The file size is obviously quite a big difference, but the issue that's bothering me is that the (perceived) loading time is at least 10 times slower. Is this size/speed discrepancy likely to be because the scanner software saved the JPEGs inefficiently / using an old compression format, or is it simply that the scanned negatives contain much more "detail" (in the form of grain/noise) than the digital images? If the former, is there a way to losslessly optimize them? I've tried re-exporting the scanned files to full-size JPEG from my RAW software, but the file size is pretty much the same. Both files will have been saved at 100 quality.
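
    Lossless optimization is possible with jpegtran from libjpeg, which rebuilds the entropy coding without touching the pixels; expect modest savings (often in the 5-20% range), since grain really does cost bits. A minimal sketch:

        # Rebuild Huffman tables losslessly; keep EXIF and other metadata
        jpegtran -optimize -progressive -copy all scan.jpg > scan-opt.jpg

    Progressive encoding can also improve perceived load time, since a coarse version of the image appears before the full file has arrived.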

    Read the article

  • Streaming media from linux server - low footprint is crucial

    - by Mike Haye
    I recently pre-ordered the Raspberry Pi ( http://www.raspberrypi.org/faqs ). For those of you who don't know it, it's a machine with 256 MB RAM and a 700 MHz processor for $35. I plan to run Linux from an SD card on this machine and have it act as an HTPC, a VPN endpoint, and a media server. In regard to the media server part, I need to find some Linux software that has a small footprint but allows me to stream media to other devices connected to the internet (preferably without having to install any additional software on the client machines). Also, I would love it if the video could be compressed, so the data usage wouldn't be so big for the client machine (e.g. when I'm using the data plan on my smartphone ;) ). Thanks in advance for any answers :) Mike.
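
    One small-footprint option for the LAN side of this is minidlna, which streams to DLNA/UPnP-capable TVs and players with no client-side install; note it serves files as-is, so the compress-for-the-data-plan part would still need separate transcoding. A sketch on Debian-style systems (the media path is an assumption):

        apt-get install minidlna

        # /etc/minidlna.conf
        media_dir=V,/home/mike/videos     # V = scan this tree as video
        friendly_name=RaspberryPi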

    Read the article

  • Is there a stable Linux distro using btrfs?

    - by chrish
    I'm a big fan of ZFS on FreeBSD (I've been using it on my home server since before it got stable; bleeding edge, baby!) and I'd like to try out btrfs to see how that's evolving. Since it's still largely in development, none of the usual mainstream distros have btrfs as an option. I haven't used Linux in a bunch of years, so I don't really know what my best options are for giving btrfs a try. Requirements: easy to install; btrfs supported without requiring me to rebuild the kernel. Thanks!
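
    Once you have a distro whose stock kernel is new enough, checking for btrfs support and trying it on a spare partition is quick; a sketch (the device name is a placeholder, and mkfs destroys whatever is on it):

        grep btrfs /proc/filesystems || modprobe btrfs   # is support compiled in?
        mkfs.btrfs /dev/sdb1                             # WARNING: wipes /dev/sdb1
        mount /dev/sdb1 /mnt/btrfs-test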

    Read the article

  • How can I connect to a CIFS/SMB share on a non-default port?

    - by fsckin
    I'm trying to get a contractor connected to a CIFS share (port 445). He's not a big shop (so no go on using a VPN), and his ISP blocks outgoing connections on port 445. I've been doing some rsync-to-FTP madness as a workaround to keep the share available to him, but it's getting out of control: we're syncing nearly 40 GB a day to an external FTP site, and it would be much easier to have him connect and grab only the stuff he needs. So... I can have the CIFS share open to the internet (filtered to allow access from his IP only) on port 446. How the heck can he connect to that? I looked through "net use" and didn't see anything about using another port.
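
    The Windows SMB client always dials port 445 and "net use" has no port option, but a port proxy on the contractor's machine can bounce a loopback alias to your port 446. A sketch with a placeholder server IP; if his local Server service refuses to share port 445, disabling local file sharing or using an SSH tunnel that listens on 127.0.0.2:445 are the usual fallbacks:

        REM Forward 127.0.0.2:445 to the share's public IP on port 446
        netsh interface portproxy add v4tov4 listenaddress=127.0.0.2 listenport=445 connectaddress=203.0.113.10 connectport=446

        REM Then map the drive against the loopback alias as normal
        net use Z: \\127.0.0.2\sharename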

    Read the article

  • Periodically overriding NTP for simulation purposes

    - by Gerard
    I have this situation: NTP is used to sync time on a set of Windows 7 and Server 2008 machines. Nothing out of the ordinary about this. Periodically on this system, the time needs to be changed for testing/training purposes (it is a training simulation system that has a lot of time-dependent operations). My question: as NTP in general does not really like big time jumps or changes, AFAIK, is there a standard way this could be set up to allow the clock to be changed at the root NTP server in the system and have it propagate through the system in a reasonable amount of time (a minute or two)? It is not acceptable to disable and/or restart all NTP client services to achieve this. Any ideas? It would be nice to do this without writing some kind of custom script to disable services and update clocks all over the place. Thanks in advance.
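
    Since these are Windows boxes, they are presumably syncing with the built-in w32time service, which can be told to accept arbitrarily large corrections and to poll frequently; that covers most of this without scripts. A sketch of the documented knobs, run once per client:

        REM Accept steps of any size instead of rejecting big jumps
        reg add HKLM\SYSTEM\CurrentControlSet\Services\W32Time\Config /v MaxPosPhaseCorrection /t REG_DWORD /d 0xffffffff /f
        reg add HKLM\SYSTEM\CurrentControlSet\Services\W32Time\Config /v MaxNegPhaseCorrection /t REG_DWORD /d 0xffffffff /f

        REM Apply the settings and force a sync against the configured source
        w32tm /config /update
        w32tm /resync /rediscover

    Shortening SpecialPollInterval (under W32Time\TimeProviders\NtpClient) brings propagation down toward the minute-or-two target.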

    Read the article

  • Postfix sends email to spam (Gmail, Hotmail)

    - by razorxan
    I recently installed a postfix + dovecot + DKIM multi-domain, multi-user, multi-alias mail server on my Debian Squeeze system. Everything works except for one big issue that basically makes the whole thing useless: every single email sent by my server goes straight into spam (Gmail, Hotmail). The first thing I did was run the well-known allaboutspam test, and everything checks out (green) except for the BATV test (yellow):

        Reverse DNS: green
        HELO Greeting: green
        RBL: green
        BATV: yellow
        SPF: green
        DKIM: green
        URIBL: green
        SpamAssassin: green
        Greylist: green

    I'm really confused and I can't see a way to solve this issue. Ask me for any details you need.
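
    When every authentication check passes, the usual remaining suspects are the IP's reputation (fresh VPS ranges often start out poorly scored by Gmail and Hotmail) and disagreement between forward DNS, reverse DNS, and the HELO name. A few quick checks with placeholder values:

        # The PTR should resolve and match your HELO/smtpd_banner hostname
        dig -x 203.0.113.25 +short

        # SPF and DKIM records as receiving servers see them
        dig example.com TXT +short
        dig default._domainkey.example.com TXT +short

    If the PTR and HELO disagree, fixing myhostname/smtpd_banner in Postfix's main.cf is a common win; beyond that, signing up for Hotmail's SNDS program gives visibility into how Microsoft scores the IP.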

    Read the article

  • RDP for High DPI Monitors?

    - by Joey
    A client is having some problems with their laptop. They use RDP to remote into their work PC, but the laptop they are using is a small 13" Sony Vaio with a 1920x1080 resolution. Everything is pretty small on the laptop anyway, but the problem is much worse after connecting with RDP, where everything is almost unreadable. I have done the obvious things: changing the resolution on the server, the RDP size, forced scaling on the terminal server, etc., but nothing has worked. Something else I would normally do is change the laptop resolution to something a little lower, but the laptop only has 2 resolution settings: the big one, and 1024x768 (the wrong aspect ratio). Any ideas?
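
    One workaround is to run the session at a lower resolution and let the client scale it to fit the panel: save the connection as a .rdp file and edit it in Notepad. A sketch; the desktop size is just an example:

        screen mode id:i:2
        desktopwidth:i:1280
        desktopheight:i:720
        smart sizing:i:1

    "smart sizing" stretches the remote desktop to the window instead of scrolling it, so a 1280x720 session fills the 1920x1080 panel with everything roughly 50% larger.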

    Read the article

  • SQL Server Transaction Log RAID

    - by Eric Maibach
    We have three SQL Server machines, and each has about five or six databases on it. We are in the process of moving these servers to a new SAN, and I am working out the best RAID configuration. Currently, the log files for all of the databases share one RAID array; there is nothing else on that array, but every database's log lives there. I have read that it is best to have log files on separate disks. In our case, though, I am not sure whether it would be better to have one big array of about 8 drives holding all the log files, or to create four two-disk arrays and give some of the larger databases their own dedicated disks for their log files.

    Read the article

  • Reliability of S.M.A.R.T.?

    - by Mark
    I've been using ActiveSmart to monitor my hard drives' health for a few weeks now, and it's telling me my brand-new 1.5 TB hard drive is half-dead already: about on par with one of my drives which I know is at least half-dead, because I've been getting read errors and hearing ticking noises from it. Now, I haven't actually noticed any problems with the 1.5 TB drive; should I be concerned that it's going to crap out on me too? Or could ActiveSmart be giving a misdiagnosis because I use it a lot or something (I've used up 795 GB in the 2 and a half weeks I've had it)? The event that ActiveSmart has been catching is "Hardware ECC recovered". Maybe these newfangled super-big hard drives somehow rely on ECC to squeeze out the extra space, and this isn't actually a cause for concern?
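
    A second opinion from smartmontools is cheap and runs on Windows too; on many drives, huge raw values for "Hardware ECC Recovered" are normal bookkeeping rather than decay, and what matters is whether the normalized value approaches its threshold. A sketch with a placeholder device name:

        smartctl -A /dev/sda          # attribute table: compare VALUE to THRESH
        smartctl -l error /dev/sda    # errors the drive itself has logged
        smartctl -t short /dev/sda    # kick off the drive's built-in self-test

    Reallocated and pending sector counts are much better predictors of imminent failure than ECC-recovered events.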

    Read the article

  • Power User - archive Outlook mail items into SQL Server

    - by marc_s
    I am looking (and so far not finding any) for a solution to archive e-mail items from my Outlook into SQL Server. My PST is beginning to get really, really big, and I'd love to extract my older e-mail into SQL Server in a way that lets me still find mails easily if needed. I would prefer SQL Server as the storage medium since I'm familiar with it, and it's rock solid; I don't want to end up with a collection of PST files or CHM files or anything like that. Does anyone know of such a solution? I'm a power/home user: I can't afford $5,000 enterprise licenses. I need a sub-$100 solution for private use.
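
    If nothing off-the-shelf turns up, the plumbing for a home-grown version is fairly small: Outlook exposes the mailbox over COM, and PowerShell (already on Windows) can push items into SQL Server. A rough sketch, assuming a pre-created MailArchive database with a matching Mail table (all names are hypothetical):

        $outlook = New-Object -ComObject Outlook.Application
        $inbox   = $outlook.GetNamespace("MAPI").GetDefaultFolder(6)  # 6 = olFolderInbox

        $conn = New-Object System.Data.SqlClient.SqlConnection("Server=localhost;Database=MailArchive;Integrated Security=true")
        $conn.Open()

        foreach ($item in $inbox.Items) {
            if ($item.MessageClass -ne "IPM.Note") { continue }  # mail items only
            $cmd = $conn.CreateCommand()
            $cmd.CommandText = "INSERT INTO Mail (Subject, Sender, Received, Body) VALUES (@s, @f, @r, @b)"
            [void]$cmd.Parameters.AddWithValue("@s", [string]$item.Subject)
            [void]$cmd.Parameters.AddWithValue("@f", [string]$item.SenderName)
            [void]$cmd.Parameters.AddWithValue("@r", $item.ReceivedTime)
            [void]$cmd.Parameters.AddWithValue("@b", [string]$item.Body)
            [void]$cmd.ExecuteNonQuery()
        }
        $conn.Close()

    Attachments and HTML bodies would need extra columns, and deleting the archived items from the PST afterwards is left to taste.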

    Read the article

  • Unix: Sync directory with FTP or SFTP directory

    - by Svish
    I have a website on my local computer running Mac OS X. I am wondering if there is any built-in command that I can run in the Terminal that will upload that website to my web server, either through FTP or, if possible, SFTP. Installing new commands through MacPorts is also a possibility. A big bonus would be if it only uploaded the files that need to be updated and not everything else. It would also be nice if I could tell it to delete the files on the server that no longer exist locally, once in a while. Any good tips?
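
    rsync ships with OS X and does exactly this over SSH (which covers the SFTP side): it transfers only changed files, and --delete prunes remote files that no longer exist locally. A sketch with placeholder paths:

        # Preview first: --dry-run lists what would change without touching anything
        rsync -avz --delete --dry-run ~/Sites/mysite/ user@example.com:/var/www/mysite/

        # Then run it for real
        rsync -avz --delete ~/Sites/mysite/ user@example.com:/var/www/mysite/

    The trailing slash on the source matters: it means "the contents of mysite", not the directory itself. For plain FTP, MacPorts has lftp, whose "mirror -R" does the same job over FTP.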

    Read the article

  • Replacing files in a folder structure with files from an unsorted folder

    - by Andrew
    I have over 50,000 PDFs organized into folders in a directory called PDFACT. I needed to compress these files, so I ran them through Adobe to batch-compress them, and this worked, except Adobe could only output the files without their folder structure. So basically I had 50,000 PDFs set up in a folder with hundreds of subfolders, and everything was organized; I ended up with one folder of 50,000 compressed PDFs, just in alphabetical order. Somehow I need to replace all the original PDFs with their compressed copies. Let me give an example: in the folder PDFACT we have the following file: C:\PDFACT\BIG DINNER\BILL\NEWESTBILL.PDF ... and in the output folder that Adobe created we have just: C:\COMPRESSED_PDF_FOLDER\NEWESTBILL.PDF. This copy is smaller than the one in PDFACT and has the same name, but it is just lumped in with every other PDF; the folder structure and subfolders are gone. Is there any way to replace all the larger, uncompressed PDFs inside the original folder structure with their now-compressed counterparts?
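
    Assuming every filename occurs only once across the whole tree (worth checking first, since two different NEWESTBILL.PDFs in different subfolders would both receive the same compressed file), a single command at a Windows prompt can walk the structure and pull in each compressed counterpart:

        REM For every PDF under PDFACT, overwrite it with the same-named file
        REM from the flat compressed folder, when one exists
        for /r "C:\PDFACT" %F in (*.pdf) do @if exist "C:\COMPRESSED_PDF_FOLDER\%~nxF" copy /y "C:\COMPRESSED_PDF_FOLDER\%~nxF" "%F"

    Inside a .bat file, double the percent signs (%%F), and test against a copied subtree before running it on the real 50,000.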

    Read the article

  • Disk space profiling in Unix

    - by user1677770
    I'm looking for a tool to summarize how disk space is being used on very large partitions. Our file system is around 950 TB, mostly broken up into 20 TB partitions. There are some really nice graphical tools for visualising these file spaces: http://www.disksavvy.com/disksavvy_screenshots.html http://methylblue.com/filelight/ But I'm really not sure how well they will scale. Does anybody have experience with these tools, and can you make any recommendations? Even something that parses and summarises a really big du output would be a good start.
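
    For the du route, a depth-limited scan piped through sort already makes a usable top-offenders report, and the full scan can be saved once and re-summarized without touching the disks again (the path is a placeholder):

        # Biggest directories two levels deep, largest first (sizes in KB)
        du -x --max-depth=2 /data/part01 | sort -rn | head -50

        # Save the full scan once, slice it as often as you like
        du -x /data/part01 > part01.du
        sort -rn part01.du | head -50

    The -x flag keeps du on one filesystem, which matters when partitions are mounted inside each other.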

    Read the article

  • How can I receive more traffic? My VPS fails!!!

    - by qtrix
    I have a web site - a photo gallery with about 400 photos, running Gallery 3 with MySQL. It's hosted on a VPS from myhosting.com (CPU 1792 MHz, 2048 MB RAM). Everything seems to be OK, but there is one big problem: once traffic reaches ~20 people online, the website starts loading really, really slowly; in fact, the website can't be loaded at all for about 30-60 seconds. What should I do? Buy more RAM/CPU on the same VPS? Move to a dedicated server, or maybe myhosting.com just sucks? What do you recommend?

    Read the article

  • How does Outlook handle old recurring reminders?

    - by Zian Choy
    Context: Windows 7 Ultimate 32-bit edition, Microsoft Office Outlook 2007. Steps to Reproduce: make an event that recurs once a week; wait a week; see that it pops up OK; wait a month; notice that it doesn't say that it is a month overdue. Expected Result: the usual note that the reminder is [x] weeks overdue. Actual Result: something like "6 days overdue". Possible Exacerbating Issue: I have many overdue reminders. For the ones that aren't time-critical (and all other things being equal), I work by category and age. For example, I do health-related reminders when I'm doing health stuff; if I have 2 health-related reminders, I do the older one first. Big Question: how is Outlook supposed to handle this sort of overdue recurring reminder? Is there any way to get Outlook to act the way I expect it to?

    Read the article

  • Network monitoring solution

    - by Hellfrost
    Hello Serverfault! I have a big distributed system I need to monitor. Background: my system is made up of two servers that concentrate and control the system. Each server is connected to a set of devices (some custom kind of RF controllers; the details don't matter to my question), each device connects to a network switch, and eventually all devices talk to the servers. The protocol between the servers and the devices is UDP; the packets are usually very small, but there are really a LOT of them. The network is also somewhat complex and is physically deployed over a large area. I'll have 150-300 of these devices, each generating up to 100+ packets per second, and several network switches, perhaps on 2 different subnets. Question: I'm looking for some solution that will allow me to monitor all this mess: how many packets are sent, where, how they move through the network, bandwidth utilization, throughput, stuff like that. What would you recommend to achieve this? BTW, playing nice with Windows is a requirement.
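
    For the raw who-talks-to-whom numbers, a packet-level conversation summary is a cheap first step before committing to a full monitoring stack; tshark (Wireshark's CLI, which runs on Windows) can produce one from any machine that sees the traffic, e.g. on a mirrored switch port:

        REM Summarize UDP conversations seen during 60 seconds on interface 1
        tshark -i 1 -a duration:60 -q -z conv,udp

    For continuous monitoring, the usual pairing is SNMP polling of the switches for bandwidth and per-port counters (PRTG and Cacti are common choices, and PRTG is Windows-native) plus an alerting tool such as Nagios or Zabbix.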

    Read the article
