Search Results

Search found 3112 results on 125 pages for 'daylight saving'.

Page 88/125

  • Series On Embedded Development (Part 2) - Build-Time Optionality

    - by user12612705
    In this entry on embedded development, I'm going to discuss build-time optionality (BTO). BTO is the ability to subset your software at build time so you only ship what is needed. BTO typically pertains more to software providers than to developers of final products. For example, software providers ship source products, frameworks or platforms which are used by developers to build other products. If you provide a source product, you probably don't have to do anything to support BTO, as the developers using your source will only use the source they need to build their product. If you provide a framework, then there are some things you can do to support BTO. Say you provide a Java framework which supports audio and video. If you provide this framework as a single JAR, then developers who only want audio are forced to ship their product with the video portion of your framework even though they aren't using it. In this case, support BTO by providing the framework in separate JARs: break the framework into an audio JAR and a video JAR and let the users of your framework decide which JARs to include in their product. Sometimes this is as simple as packaging, but if, for example, the video functionality depends on the audio functionality, it may require coding work to cleanly separate the two. BTO can also work at install time, and this is sometimes overlooked. Let's say you're building a phone application which can use Near Field Communication (NFC) if it's available on the phone, but doesn't require NFC to work. Typically you'd write one app for all phones (saving you time), both those that have NFC and those that don't, and just use NFC if it's there. However, for better efficiency, you can detect at install time whether the phone supports NFC and skip installing the NFC portion of your app if it doesn't. This requires that you write the app so it can run without the optional NFC code, and that you write your installer so it can detect NFC and do the right thing at install time. Supporting install-time optionality saves persistent footprint on the phone, something your customers, your app's "neighbors", and you will all appreciate. In the next article, I'll talk about runtime optionality.
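
    The same "let consumers pick only the pieces they need" idea shows up in other ecosystems too. As a purely illustrative sketch (the package and dependency names are made up and this is not from the article), Python's setuptools extras express the audio/video split at the packaging level like this:

        # setup.py -- illustrative sketch of build-time optionality via packaging extras
        from setuptools import setup, find_packages

        setup(
            name="mediaframework",            # hypothetical framework name
            version="1.0",
            packages=find_packages(),
            install_requires=["numpy"],       # core dependency every consumer gets
            extras_require={
                # consumers opt in per feature, e.g.  pip install "mediaframework[video]"
                "audio": ["soundfile"],
                "video": ["av"],
            },
        )

    A consumer who only needs audio installs "mediaframework[audio]" and never pulls in the video stack's dependencies - a packaging-level analogue of shipping separate audio and video JARs.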

    Read the article

  • "shutting down hyper-v virtual machine management service"

    - by icelava
    I have a Windows 2008 R2 server that is a Hyper-V host (Dell PowerEdge T300). Today for the first time I encountered an odd situation; I lost connection with one of the guest machines, but logging on physically it seems the guest OS is still running, just no longer contactable via the network. I tried to shut down the guest machine (Windows XP) but it would not shut down, getting stuck in a "Not responding" dialog box that cannot be dismissed. I used the Hyper-V management console to reset the machine and it could not get out of the resetting state. I tried to save another Windows 2003 guest machine, and it made no progress, sitting at 0% in the Saving state. The other running Windows 2003 guest was stuck at the logon dialog. My first suspicion is that one of the Windows update patches this week (10 Nov 2011) may have something to do with it; a system restart was still pending. Well, since I could not do anything with Hyper-V I proceeded with the Windows Update restart, and now it has been stuck for half an hour at "Shutting down Hyper-V Virtual Machine Management service". Prior to restarting I did not observe any hard disk errors reported in the system event log, so I doubt it is a disk-related condition. Shall I force a hard reboot? UPDATE: As reported in the answer, it eventually restarted itself.

    Read the article

  • Convert Custom Firefox Setup to Firefox Portable?

    - by dfree
    I have a pretty awesome Firefox setup and spent a lot of time getting it perfect. Is there any way that anyone knows about to convert the entire configuration to portable? Programs like MozBackup are great for backing up the complete setup, but you can't restore a Firefox profile to Firefox Portable (maybe there is a workaround to fake it out? or possibly another method?). In case anyone is interested, here is the gist of the best add-ons I've found: Autopager (scroll down Google and other multi-page results without clicking next), Coral IE Tab (IE in Firefox, in case a website 'insists' that you use IE), CyberSearch (search Google straight from the address bar - VERY HELPFUL), Download Statusbar (display progress of downloads at the bottom of Firefox - no annoying popups), FireFTP (erases the need for an external FTP client - opens in a tab), Gmail Manager (if you use multiple Gmail accounts), Session Manager (saving multiple sessions of tabs - Firefox session recovery), Surf Canyon (pull relevant stuff out of the depths of search results - even from craigslist), Tab Mix Plus (ESSENTIAL - tab behavior customization - have multiple rows of tabs). I also have it set up so you can type 'g test' in the address bar and Firefox will pull up the Google results for 'test'. Similarly I have it set up for guitar tabs (tab), facebook (f), wikipedia (w), google maps from my house (gmhome), torrents (tor), ticketmaster (t), rotten tomatoes (rt), craigslist (c), plus about 20 other sites.

    Read the article

  • Connection speed drops from 1 Gbps to 10 Mbps (Vista 64)

    - by Kevin Hakanson
    I recently got a Windows Home Server (HP MediaSmart Server EX490) set up so I could do backups and other things. However, I am having trouble on my Vista 64 PC. The backup will be making great progress, then it will just slow down. At one point, I noticed the lights on my Netgear GS105 indicated it was not using a 1000 Mbps connection, but a 10 Mbps one. I checked the Status of Local Area Connection (Intel(R) 82567V-2 Gigabit Network Connection) and that also showed the same slow speed. This has happened several times in the last couple of days. When I disabled the network device and then enabled it, it established the 1 Gbps connection again. However, sometimes the Sent Bytes Activity on the Status window indicates that the data flow is still slow (100 to 1000 bytes every couple of seconds). Obviously, at this rate I could back up faster to floppy disk. :) My question is how to diagnose and fix this problem. When I look at the Administrative Events, I see an error, "Bonjour Service 456: ERROR: read_msg errno 10054 (An existing connection was forcibly closed by the remote host.)", and a warning, "e1yexpress Intel(R) 82567V-2 Gigabit Network Connection Link has been disconnected." I am suspicious there is some power-saving mode at work. I found a post suggesting System Idle Power Saver (SIPS) may be the issue. I am going to try that, but am looking for other suggestions or diagnostic advice. I have several new items in this configuration: server, client software, switch and Cat6 cables.

    Read the article

  • Can compressing Program Files save space *and* give a significant boost to SSD performance?

    - by Christopher Galpin
    Considering solid-state disk space is still an expensive resource, compressing large folders has appeal. Thanks to VirtualStore, could Program Files be a case where it might even improve performance? Discovery: In particular I have been reading: SSD and NTFS Compression Speed Increase?; Does NTFS compression slow SSD/flash performance?; Will somebody benchmark whole disk compression (HD, SSD) please? (may have to scroll up). The first link is particularly dreamy, but maybe has its head a little too far in the clouds. The third link has this sexy semi-log graph (logarithmic scale!). Quote (with notes): "Using highly compressible data (IOmeter), you get at most a 30x performance increase [for reads], and at least a 49x performance DECREASE [for writes]." Assuming I interpreted and clarified that sentence correctly, this single user's benchmark has me incredibly interested. Although write performance tanks wretchedly, read performance still soars. It gave me an idea. Idea: VirtualStore. It so happens that thanks to sanity-saving security features introduced in Windows Vista, write access to certain folders such as Program Files is virtualized for non-administrator processes. Which means, in normal (non-elevated) usage, a program or game's attempt to write data to its install location in Program Files (which is perhaps a poor location anyway) is redirected to %UserProfile%\AppData\Local\VirtualStore, somewhere entirely different. Thus, to my understanding, writes to Program Files should primarily occur only when installing an application. This makes compressing it not only a huge source of space gain, but also a potential candidate for performance gain. Testing: The beginning of this post has me a bit timid; it suggests benchmarking NTFS compression on a whole drive is difficult because turning it off "doesn't decompress the objects". However, it seems to me the compact command is perfectly capable of doing so for both drives and individual folders. Could it be only marking them for decompression the next time the OS reads from them? I need to find the answer before I begin my own testing.

    Read the article

  • Downloading a file from the internet with '&' in URL using wget

    - by matt_tm
    Hi, I'm trying to download a file from a URL that looks like this:

        http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf

    Within the browser, this link prompts me to download a file called x.pdf irrespective of what DEF is (but 'x.pdf' is the right content). However, using wget I get the following (note that wget only ever sees the URL up to p1=ABC, because the shell treats the unquoted & as a command separator):

        >wget.exe http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
        --2011-01-06 07:52:05--  http://pdf.example.com/filehandle.ashx?p1=ABC
        Resolving pdf.example.com... 99.99.99.99
        Connecting to pdf.example.com|99.99.99.99|:80... connected.
        HTTP request sent, awaiting response... 500 Internal Server Error
        2011-01-06 07:52:08 ERROR 500: Internal Server Error.
        'p2' is not recognized as an internal or external command, operable program or batch file.

    This is on a Windows Vista system.

    Edit 1: Quoting the URL makes it work:

        >wget.exe "http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf"
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
        --2011-02-06 10:18:31--  http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf
        Resolving pdf.example.com... 99.99.99.99
        Connecting to pdf.example.com|99.99.99.99|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: 4568 (4.5K) [image/JPEG]
        Saving to: `filehandle.ashx@p1=ABC&p2=DEF.pdf'
        100%[======================================>] 4,568 --.-K/s in 0.1s
        2011-02-06 10:18:33 (30.0 KB/s) - `filehandle.ashx@p1=ABC&p2=DEF.pdf' saved [4568/4568]

    Read the article

  • Open Office crashes, recovers, crashes again

    - by Daniel R Hicks
    After completely reinstalling my laptop due to apparent registry corruption, I've encountered a problem with Open Office: I open a simple Calc spreadsheet, it comes up normally, but then after anywhere from 5 seconds to several minutes (without even touching the Calc window) OO crashes, then comes back up through recovery. If I let it "recover" it will do so and bring the spreadsheet up again, only to repeat the crash scenario. If I kept clicking "OK" it would apparently do this all day. I reinstalled OO once and the problem went away for a while, but it came back. I then attempted to "reset" my profile (i.e., rename the OO user directory in AppData), but OO crashed during the first startup after that, then resumed the original behavior. If I open the same file using Excel it complains of errors in the file and "recovers" them, but the "error report" it generates contains no details. If I save the "recovered" file then OO Calc will open it, but the problem returns after saving again. Any ideas? (The system is Vista SP2, running OO 3.4.1.) How to reproduce: 1) Start Open Office Calc. 2) Save the workspace as "CrashTest.ods". 3) From Task Manager, kill Open Office (soffice.exe/bin - one of each). 4) Double-click the saved "CrashTest.ods" in Explorer. 5) OO puts up a message that recovery will occur - allow it. 6) When the Calc window comes up, don't touch it - just wait about 10 seconds. 7) The Calc window closes and OO puts up a message that recovery will occur - from now on the sequence repeats. I suspect this behavior is limited to a few (recent) versions of OO, and very possibly only Calc. Reported as Open Office Bug 1211094. Sigh!! As much as it irritates me, I'm having to switch over to Excel for several things I used to do with Calc. Excel has a miserable UI, but at least it stays up for longer than 10 seconds.

    Read the article

  • Deleted, then added user w/ same name, now logs on w/ temp profile

    - by labyrinth
    I am a new admin at a high school lab and am trying to spearhead the separation of normal IT accounts from IT admin accounts. I made my normal account (e.g. ITuser) and an admin account (e.g. ITuser-adm) on the server (Win Server 2008 R2). I used both accounts on my main desktop for about a day, but decided I hadn't set up the admin account correctly. I deleted my admin account, then made a new one with the same name. The problem is that on my main desktop (Windows 7 Pro), whenever I log in with my admin account, it gives the following errors: "Windows has backed up this user profile. Windows will automatically try to use the backup profile the next time this user logs on." (Error 1515) and "Windows cannot find the local profile and is logging you on with a temporary profile. Changes you make to this profile will be lost when you log off." (Error 1511). This is more of a nuisance than anything for me; I just thought I could reuse the name of a user account I'd just deleted, since the two accounts would have separate SIDs anyway. If it's less trouble, I could just make a new admin account. Or I could just keep using it as is, since I don't need to save anything locally anyway and the typical folder redirects work fine. I'm just curious and want to understand what's going on. There are no errors listed regarding the registry.

    Read the article

  • How to diagnose disk errors when disk appears to be ok?

    - by Kylotan
    I have a six-month-old 1TB Seagate drive formatted into 2 NTFS partitions, and the disk appeared to be failing, with Windows dropping down from UDMA to PIO mode, reporting Delayed Write Errors, and hanging Explorer when browsing directories. My initial suspicion was that the disk was dying. However, on further examination it appears that Ubuntu, which doesn't write to the volume as frequently as Windows does, was able to read the disk properly and retrieve all the data intact, saving me from having to use an older backup. Finally, running the SeaTools DOS diagnostic reported that the disk has no problems: no SMART errors and no bad sectors, apparently. This, in combination with the relative youth of the disk, suggests that something else is broken. The cable? The PSU? The integrated disk controller? But what would be a good way to diagnose the problem without risking damaging the data? I intend to extract the disk, try it in an external eSATA enclosure and see if the write errors cease, but in the event of the disk appearing to be fine, I would like to be able to confirm which part of the hardware is actually broken in order to know just what needs replacing. Are there any good ways to go about this?

    Read the article

  • wget mirroring, subdomains and directories and cookies

    - by Jimmu
    Hi all, I have an account on a web page that is now "full" (i.e. I have used up all my allocated space) and I would like to make a mirror of that site. wget seems like the thing to use. The problem is that I would only like to mirror the pages that lie within this directory, http://user.domain.com/room/2324343/transcript/ (and its sub-directories), whilst saving the correct stylesheets, javascripts and CSS etc., which exist in different directories. There are also uploaded files, linked to within the pages in the transcript directory but stored in different directories, that I would like to download/mirror; these are in a variety of formats (.exe, .py, .png, .app and many more). There are also images on these pages that live on different servers. Also, I would like the links (which are sometimes relative, sometimes absolute but internal, sometimes external) to work correctly, so that if they point to things that have been downloaded (mirrored) they work fine without an internet connection, but if they point to things that are external or haven't been mirrored they link to the external site. Basically, so they work as expected. Another problem is that you have to log in to access the site. Can wget be used to accomplish this, or is there a better way? Either way, how do I achieve this? (I asked this question at stackoverflow.com/questions/2190115/wget-mirroring-subdomains-and-directories-and-cookies but it was recommended that I try asking it here.)

    Read the article

  • On a dual-GPU laptop, is using the discrete GPU ever more power efficient?

    - by Mahmoud Al-Qudsi
    Given a laptop with a dual integrated/discrete GPU configuration, is it ever more power efficient to use the discrete GPU instead of the integrated one? Obviously when writing an email or working on a spreadsheet, the integrated GPU will always use less power. But let's say you're doing something graphics-medium but not graphics-intensive/heavy - is there a point where it actually makes sense to fire up the discrete GPU, not for performance but for power-saving reasons? Off the top of my head, I can think of a scenario where the discrete GPU supports hardware decoding of a particular video codec - I'd imagine there is a "price point" where using that GPU saves more energy than decoding the video fully in software would. But I think most GPUs, integrated or discrete, pretty much decode just plain-Jane H.264. Maybe there is something more complicated, though, perhaps desktop/windowing animations or a Flash animation on a website (not an embedded Flash video) - maybe the discrete GPU would use enough less power to make up for switching to it? I guess this question can be summed up as: on a laptop with two GPUs, if you don't care about performance, can you say beyond doubt that you should always use the integrated GPU for maximum battery life?

    Read the article

  • System hangs while rebooting on Debian...

    - by Usman
    Hi, I have Debian (kernel 2.6.26-2-686) installed on two computers. On one of them it reboots just fine, but I am having the following problem rebooting Debian on my second computer. When I type reboot at the Linux prompt, the following messages appear and the system hangs after saying "Restarting System":

        Broadcast message from root@myname (tty1) (Sun Jan 17 11:23:26 2010)
        The system is going down for reboot NOW!
        INIT: Switching to runlevel: 6
        INIT: Sending processes the TERM signal
        Saving system clock
        Stopping enhanced syslog: rsyslogd.
        Asking all remaining processes to terminate...done.
        Deconfiguring network interfaces...done.
        Cleaning up ifupdown....
        Deactivating swap...done.
        [ 31.789103] Restarting System.
        _

    Normally when the system is busy the "_" cursor blinks, but the "_" on the last line above does not blink, which shows the system has hanged. I tried all the keys but the screen is still frozen at the same point. The difference I noted between my two computers is that the BIOS of the system giving me this error has no ACPI support, whereas the BIOS of my first computer, on which Debian does not have this restart-hanging problem, does have ACPI support. I have also disabled running the acpid script by running update-rc.d -f acpid remove, but the problem still persists on the second computer. Any ideas to solve or get around this problem?

    Read the article

  • Can't connect two PCs to a Network Switch at the same time (Windows 7)

    - by puk
    I have two computers connected to a network switch, and every once in a while one of the computers will lose its internet connection. It's almost always the same computer. However, if I play around with the control panel, I can switch it so that the other computer is the one not connected. Restarting either of the computers does not help. In Windows, the world's-greatest-troubleshooter tells me that a network cable is unplugged and that I should try plugging it in... Disabling and re-enabling my NIC does not fix this problem; neither does swapping cables around. When rebooting, the BIOS complains that the Ethernet cable is not plugged in. If it's in any way relevant, my setup at the office is like so: Modem - Router - Network Switch 1 - Network Switch 2. I have tried turning off the energy-saving option for my NIC, and I tried manually setting the link speed to 100 Mbps Full Duplex, without any luck. Also, I have a Realtek PCIe GBE Family controller in both computers. Does anyone have any idea why this is happening every 5-10 days? EDIT: I have also tried using a completely different network switch and the problem still persists as before.

    Read the article

  • Synergy on MacBook with OS X Mavericks wifi connection

    - by user332956
    I'm trying to set up Synergy with my MacBook Pro running OS X 10.9.3 as a client and my Windows 7 desktop as a server. I'm having some pretty bad connection problems when I try to use my Mac, though. Every couple of seconds the mouse or the keyboard will stop working entirely, then come back. I ran some tests and found that the ping from my desktop to my Mac would be very high every third ping or so (1000+ ms), or would sometimes even time out. If I ping my desktop from my Mac, the pings are all reasonably low. I believe this is a power-saving feature of Mavericks, and I have found a way to get around it by continually pinging my router from my Mac, which keeps my wifi card from going to sleep. I'm using this right now to type this up with Synergy and have had zero issues. Has anyone else run into this issue and found a better solution? So far, I think my best bet would be to buy an Ethernet adapter, but I'd rather not have yet another cable running across my desk.

    Read the article

  • Samba Share - MS Excel when saved (can't access the file, there are several possible reasons)

    - by brain90
    Dear fellow ServerFaulters, I have a weird problem with my Samba share. I have one share definition for 3 clients (A, B, C). This share contains some Excel files which have a lot of formulas and are linked to each other. Client A accesses the files with LibreOffice (Ubuntu), client B accesses them with Windows XP and MS Office 2003, and reading and writing work successfully for both of them. The problem occurs when client C accesses the same file with MS Excel 2003 (Windows XP). This message box appears when saving the file:

        Microsoft Office Excel cannot access '\\192.168.1.23\myshare\'. There are several possible reasons:
        - The file or path does not exist.
        - The file is being used by another program.
        - The workbook you are trying to save has the same name as a currently open workbook.

    I tried http://support.microsoft.com/kb/291204 but it didn't work. Below is my share definition:

        [brainshare]
        comment = brainshare
        path = /opt/brainshare/
        valid users = @brainshare
        force group = brainshare
        read only = No
        create mask = 0775
        veto files = /*.scr/*.eml/thumbs.com/

    Help me please... Thanks in advance! Server: Ubuntu 10.10, Samba version 3.5.4

    Read the article

  • Incomplete Apache logging

    - by Manz
    I have a problem with Apache running on a Linux server: it doesn't log entire error messages. The errors themselves are PHP "undefined index" notices, for example. Some lines from the error.log file:

        [Thu Nov 29 05:29:06 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: lin
        [Thu Nov 29 05:29:06 2012] [warn] mod_fcgid: stderr: 9
        [Thu Nov 29 05:31:30 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link in /var/www/html/sit
        [Thu Nov 29 06:01:18 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link in /var
        [Thu Nov 29 06:06:09 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined
        [Thu Nov 29 06:06:15 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index:
        [Thu Nov 29 06:13:04 2012] [warn] mod_fcgid: stderr: PH
        [Thu Nov 29 07:14:16 2012] [warn] mod_fcgid: stderr: PHP Notice: Undef
        [Thu Nov 29 07:32:16 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link in /var/www/ht
        [Thu Nov 29 07:34:26 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link
        [Thu Nov 29 07:34:30 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link in /var/www/html/site.com/
        [Thu Nov 29 07:41:10 2012] [warn] mod_fcgid: stderr: PHP Notice: Und
        [Thu Nov 29 07:41:11 2012] [warn] mod_fcgid: stderr: PHP Notice: Und
        [Thu Nov 29 07:41:12 2012] [warn] mod_fcgid: stderr: PHP Notice: Und
        [Thu Nov 29 08:14:20 2012] [warn] mod_fcgid: stderr: PHP Notice: Undef
        [Thu Nov 29 12:36:54 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: li
        [Thu Nov 29 12:37:04 2012] [warn] mod_fcgid: stderr: PHP Notice: Unde
        [Thu Nov 29 12:46:52 2012] [warn] mod_fcgid: stderr: PHP Notice: Undefined index: link in /var/www/htm
        [Thu Nov 29 13:00:33 2012] [warn] mod_fcgid: stderr: line 35
        [Thu Nov 29 13:10:55 2012] [error] [client XXX.XX.XX.XX] File does not exist: /var/www/h

    Some lines are incomplete and truncate the error message. Does anyone know why Apache is saving incomplete error messages?

    Read the article

  • Help Email Account Management among multiple users

    - by CogitoErgoSum
    I preface this by saying this may belong in IT Security; not too sure, feel free to move it. Currently we have an email account, [email protected], hosted via Google Apps (as is all our email). We had an incident where we had to terminate an employee. This employee, however, had the password for this account, as we have 20-30 people using it at any given point to manage customer emails etc. Thinking on this, I feel there must be a better way to manage access. With Google you can associate up to 10 email accounts with another; the problem is we have more like 20-30 people using it. We were evaluating tools such as Salesforce and Assistly, where people have their own login credentials and the system contains the appropriate SMTP information for the [email protected] address, so emails are sent from it rather than from a user's personal account. Aside from those options, does anyone have any other thoughts? One suggestion floated was moving everyone to desktop clients and saving the password info there, so they could only log in from their physical workstations, but we may have situations where we'd like employees to work remotely. Does anyone have experience with this sort of system, where ~20-30 people are responding from one mailbox, and with how to manage security and access?

    Read the article

  • SYS-5016T-MTFB will not POST without manual assistance (Motherboard: X8STi-F)

    - by Dan
    I have a Supermicro 5016T-MTFB 1U server which I am in the process of setting up, but it has a really strange problem. When the system is powered on it will not POST until I press the reset button a few times, followed by pressing the Delete key on the keyboard to "wake it up". If I power it on and do nothing, the fans spin up but nothing else happens at all. After pressing the reset button once, the red "overheat" light comes on and blinks, which is supposed to indicate a fan failure - but all the fans are working. Pressing reset again usually stops the blinking, and the system starts the normal POST routine, but it will not actually get to the BIOS screen unless I press Delete. If I don't press Delete, it just continues to hang. After pressing Delete it takes me into the BIOS setup screen; if I exit without saving changes I can boot the system normally. I was able to successfully install Linux with no trouble... but upon rebooting the same problem happened again. This board has integrated IPMI which I thought was the problem, so I disabled it via the jumper on the board. That did not help. Each time this system powers on, it runs for a second, then turns off again for another second, then turns back on again. I don't know why it does that. Here is what I put in the system: 1 x Xeon E5630 (Nehalem), 80W TDP (it's not overheating, CPU temps stay under 40 degrees C); 2 x Kingston 2GB x 3 DDR3-1066 ECC unbuffered, unregistered memory (KVR1066D3E7SK3/6G); 1 x Intel X25-M 160 GB; 2 x Western Digital RE3 1TB.

    Read the article

  • I am trying to set up an Ubuntu Server 12.04 on my machine [migrated]

    - by Jseb
    I am trying to set up a server on my home network which will eventually host Rails. I am not great with Linux servers and I am trying to follow the prompts. I did successfully get to a black screen which then prompts me for a username and then a password (to then do anything, I assume). Here is what I tried. I roughly followed this tutorial, http://www.ubuntugeek.com/step-by-step-ubuntu-11-04-natty-lamp-server-setup.html, though my commands were not 100% like his - not in the same order, but the same idea. Then I wanted to install Ubuntu Server with a GUI. Here are the commands I tried:

        sudo apt-get upgrade
        sudo apt-get install ubuntu-desktop

    These, however, give me the following (truncated) errors:

        Err http... inRelease
        w Failed to fetch ht...  So been ignored

    If I try the desktop one I get:

        E: unable to locate package ubuntu
        E: unable to locate package desktop

    So I am assuming I am not connected to the internet, so I tried the following command:

        sudo vi /etc/network/interfaces

    Here is the output it gives me (and I know the gateway on my laptop is 192.168.1.1):

        address: 192.168.1.148
        netmask: 255.255.255.0
        network: 192.168.1.0
        broadcasts: 192.168.1.255
        gateway: 192.168.1.1

    By the way, I do not know the command to get out of vi and save. I also see:

        Err http://us.archive.ubuntu.com precises InRelease
        Err http://us.archive.ubuntu.com precises-updates InRelease
        Err http://us.archive.ubuntu.com precises-backports InRelease
        Reading package lists... Done
        W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/InRelease
        W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/InRelease
        W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backport/InRelease

    Read the article

  • Graphic Design in Outlook HTML Emails

    - by PhilPursglove
    At the moment we are creating artwork in Word and saving it as an HTML file, then opening up a new email, clicking Insert on the menu, clicking 'File', selecting the HTML file and choosing 'Insert as Text'. The Word document is then embedded into the email and we can create HTML links from there. The problem with this method is that we are limited to what we can create visually in Word. The artwork just does not look professional enough, and we find that sometimes the headers or footers do not appear or do not stay in their correct positions. What I would like to do is to be able to start in Adobe InDesign (the graphics package we use). So far I have been able to: create artwork in InDesign and create buttons and hyperlinks in InDesign; export it as a PDF, maintaining the hyperlinks; save it as an HTML document; open a new email; insert the HTML file, choosing 'Insert as Text'. The problem with this method is that the images move about and the text is all different sizes, but on the plus side, the hyperlinks have been retained. So I am almost there, but not quite. Can anyone suggest what I need to do to get the design to display 'correctly' in Outlook?

    Read the article

  • Excel file growing huge (>150 MB)

    - by Josh
    There is one particular Excel file that is used by a number of employees at my company. It is edited from both Excel 2003 and 2007, with the "Sharing" feature turned on to allow multiple writers at once. The file has a decent amount of data on several sheets with some basic formatting, and used to be about 6 MB, which seems reasonable for its content. But after a few weeks of editing, the file grew to 10, then 20 MB, and eventually skyrocketed to more than 150 MB, even though it still has about the same amount of data as before. It now takes 5-10 minutes to open it, and that much time again to save it. The first time this happened, I copied the content of each sheet into a new, blank workbook and saved the new workbook; this brought it back down to about 6 MB. Now, it has blown up again. The workbook uses the "Data Validation" feature to limit the values in certain columns to the contents of a few named ranges. Copying all the data into a new workbook means re-setting up all the data validation, which is a pain and not something that we want to do every month. As a troubleshooting step, I tried saving the file in "XML Spreadsheet 2003" format, hoping to get some insight into what was being stored. Sure enough, the file was almost a gig, and almost all of its 10 million lines look like this:

        <NamedCell ss:Name="Z_21D5114F_E50C_46AC_AA4F_C3FF540C717F_.wvu.FilterData"/>
        <NamedCell ss:Name="Z_1EE2BA5E_3011_4F9A_8ACD_E58835250FC4_.wvu.FilterData"/>
        <NamedCell ss:Name="Z_1E3BDCEA_6A72_4ECC_BF4F_7B03CC66181E_.wvu.FilterData"/>

    I've seen a few VBScripts online to manage and enumerate named cells that are hidden in Excel's built-in interface, though I wonder how they'd handle my 10 million named cells. What I really need, though, is an understanding of why this keeps happening. What actions in Excel could be causing this?
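
    (As an aside for anyone hitting the same symptom: the ".wvu.FilterData" names appear to be per-user view/filter definitions created by the shared-workbook feature, and they can be stripped programmatically. A minimal sketch in Python with openpyxl is below; it assumes the workbook has been saved as .xlsx (openpyxl cannot read the legacy .xls binary format), assumes openpyxl 3.1+ where defined_names behaves like a dict, and uses made-up file names.)

        from openpyxl import load_workbook

        wb = load_workbook("bloated.xlsx")   # hypothetical file name

        # Collect the hidden custom-view names first so the mapping is not
        # modified while iterating; only workbook-level names are handled here.
        stale = [name for name in list(wb.defined_names) if ".wvu." in name]
        for name in stale:
            del wb.defined_names[name]

        print(f"Removed {len(stale)} hidden named ranges")
        wb.save("slimmed.xlsx")              # hypothetical output name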

    Read the article

  • Python coding with VLC player (quite a basic query I expect)

    - by Todd
    I'm fairly new to the whole coding realm, so my knowledge is fairly limited, and I can't seem to find any basic tutorials on how to use scripts with VLC player. More specifically, the reason I'm asking here is because I stumbled across a post on this site about playing random clips from random videos in VLC player automatically. This is the forum post: Playback random section from multiple videos changing every 5 minutes. My situation is similar to this lovely gentleman's, though he clearly knows a lot more about coding than I do. In short, I'd like to copy this code into a file of some sort and apply it to VLC player myself. Only I'm not sure what file type I'd have to save it as (I have Python, by the way, and I tried saving it as a .py file but I didn't know if that was correct or where to go from there). Additionally, I'm not sure how to get VLC to "read" the script, so to speak - is there a specific location the file needs to be in, and do I run the script from another program or through VLC? I'll reiterate that I'm relatively new to this, so if anybody would be so kind as to post a quick list of steps on how to save/place the file and use it with VLC player, I really would appreciate it! P.S. I'm not computer illiterate; I'm fine with most programs, and I'd understand if you just said things like "C:\Program Files (x86)\VideoLAN\VLC\plugins" or "in VLC, select Tools > Plugins and extensions", I just wouldn't catch on to anything about adding a line of code that does something without being told exactly what to write! Many thanks in advance! :) Todd
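
    (For what it's worth: if the script from that post is a standalone Python script rather than a VLC extension, saving it as a .py file is the right choice; you then run it with Python from a command prompt, and it drives VLC through the python-vlc bindings rather than being loaded from VLC's plugins folder. Below is a rough, illustrative sketch of that pattern - not the script from the linked post - with made-up file paths, assuming "pip install python-vlc" and a normal VLC install.)

        import random
        import time
        import vlc

        videos = ["C:/Videos/clip1.mp4", "C:/Videos/clip2.mp4"]  # made-up paths
        player = vlc.MediaPlayer()

        while True:
            player.set_media(vlc.Media(random.choice(videos)))
            player.play()
            time.sleep(0.5)                   # give VLC a moment to open the file
            length_ms = player.get_length()   # total running time in milliseconds
            if length_ms > 30000:
                # jump to a random start point at least 30 seconds from the end
                player.set_time(random.randint(0, length_ms - 30000))
            time.sleep(30)                    # let this section play for 30 seconds
            player.stop()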

    Read the article

  • Why is my Network Connections list empty?

    - by DealerNextDoor
    I'm sure this question has been asked before, but none of the links I have found have worked. I've been trying to find fixes for the past couple of weeks. This all started a few days after I got my router. At first, I thought it was just something that would fix itself. But as usual, it never does. I am trying to update my wireless card's driver to try and fix this problem, but I need to get the card's information to find the update on the HP website. And since my Network Connections list is empty, I can't get any information about it. To get around this, I tried to go to 'Manage Wireless Networks', and when I tried to get the properties from there, I got this error: "Windows has encountered an error saving the wireless profile. Specific error: The data is invalid." So, what can I do to try and fix this? Any help will be appreciated. EDIT: Sorry, forgot to include the router info. Router model: WNR1000v2-V2. Router maker: NETGEAR. Router firmware version: V1.0.0.12NA. The router is up to date on all updates.

    Read the article

  • RabbitMQ message consumers stop consuming messages

    - by Bruno Thomas
    Hi Server Fault, our team is in a spike sprint to choose between ActiveMQ and RabbitMQ. We made 2 little producer/consumer spikes sending an object message with an array of 16 strings, a timestamp, and 2 integers. The spikes are OK on our dev machines (messages are consumed properly). Then came the benchmarks. We first noticed that sometimes, on our machines, when we were sending a lot of messages the consumer would hang. It was there, but the messages were accumulating in the queue. When we went to the bench platform: a cluster of 2 RabbitMQ machines, 4 cores/3.2 GHz, 4 GB RAM, load balanced by a VIP; one to 6 consumers running on the RabbitMQ machines, saving the messages in a MySQL DB (same type of machine for the DB); 12 producers running on 12 AS machines (Tomcat), attacked with JMeter running on another machine. The load is about 600 to 700 HTTP requests per second on the servlets, which produce the same load of RabbitMQ messages. We noticed that sometimes consumers hang (well, they are not blocked, but they don't consume messages anymore). We can see that because each consumer saves around 100 msg/sec in the database, so when one stops consuming, the overall messages saved per second in the DB fall by the same ratio (if, say, 3 consumers stop, we fall from around 600 msg/sec to 300 msg/sec). During that time, the producers are fine and still produce at the JMeter rate (around 600 msg/sec). The messages are in the queues and are taken by the consumers still "alive". We load all the servlets with the producers first, then launch all the consumers one by one, checking that the connections are OK, then run JMeter. We are sending messages to one direct exchange. All consumers are listening to one persistent queue bound to the exchange. That point is major for our choice. Have you seen this with RabbitMQ? Do you have an idea of what is going on? Thank you for your answers.
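
    (For readers trying to picture the setup: a minimal consumer along the lines described - one durable queue bound to a direct exchange, explicit acknowledgements, a bounded prefetch - looks roughly like this in Python with pika. This is only an illustrative sketch; the spikes described above are not Python, and every name here is made up.)

        import pika

        conn = pika.BlockingConnection(pika.ConnectionParameters(host="rabbit-vip"))  # hypothetical host
        ch = conn.channel()

        ch.exchange_declare(exchange="spike", exchange_type="direct", durable=True)
        ch.queue_declare(queue="spike.messages", durable=True)
        ch.queue_bind(queue="spike.messages", exchange="spike", routing_key="msg")
        ch.basic_qos(prefetch_count=50)   # cap unacknowledged messages per consumer

        def handle(channel, method, properties, body):
            # save_to_mysql(body)  -- placeholder for the database insert described above
            channel.basic_ack(delivery_tag=method.delivery_tag)

        ch.basic_consume(queue="spike.messages", on_message_callback=handle)
        ch.start_consuming()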

    Read the article
