Search Results

Search found 4745 results on 190 pages for 'spare parts'.

Page 85/190

  • Website loading until initial script finishes

    - by wardy277
    Hi, I have a heavily used server (running Plesk). I have some long scripts that take a while to process (huge MySQL database). I have found that in one browser, while such a script is running, I cannot view any other part of the site until the script finishes: the requests go off, but they don't get served until the initial script completes. I thought this might be a server-wide issue, but it is not. From another computer I can view the site fine, and even on the same computer with a different browser I can navigate fine while the script is still loading. I think it must limit the number of requests per session. Is this correct? Is there any way to configure this to allow 2-3 other requests per session? It is really bad that when I am on the phone to a client, having just run a long report, I cannot use the site or follow what they are saying until the page has loaded. Chris
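
    This behaviour matches PHP's session file locking: session_start() takes an exclusive lock on the session file, so every other request from the same browser (same session cookie) queues until the first request releases it. A minimal sketch of the usual workaround, assuming the report script only reads session data up front:

        <?php
        // Long-running report script: read what we need from the session,
        // then release the session lock so other tabs can load in parallel.
        session_start();
        $user_id = $_SESSION['user_id'];   // hypothetical field
        session_write_close();             // releases the per-session lock

        // ... the long MySQL report runs here without blocking the session ...

    After session_write_close() the script can still read its local copies, but any later writes to $_SESSION are not persisted unless the session is reopened.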


  • Avoiding syslog-ng noise from cron jobs [closed]

    - by Eyal Rozenberg
    Possible Duplicate: How can I prevent cron from filling up my syslog?

    On my small Debian squeeze web server, I have syslog-ng installed. Generally, my logs are nice and quiet, with nice -- MARK -- lines. My syslog, however, is littered with this kind of garbage:

        Sep 23 23:09:01 bookchin /USR/SBIN/CRON[24885]: (root) CMD ( [ -x /usr/lib/php5/maxlifetime ] && [ -d /var/lib/php5 ] && find /var/lib/php5/ -type f -cmin +$(/usr/lib/php5/maxlifetime) -delete > /dev/null)
        Sep 23 23:09:01 bookchin /USR/SBIN/CRON[24886]: (root) CMD ( [ -d /var/lib/php4 ] && find /var/lib/php4/ -type f -cmin +$(/usr/lib/php4/maxlifetime) -print0 | xargs -r -0 rm > /dev/null)
        Sep 23 23:17:01 bookchin /USR/SBIN/CRON[24910]: (root) CMD ( cd / && run-parts /etc/cron.hourly)

    What's the clean way to avoid it?
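
    For reference, the usual syslog-ng approach is to filter the cron facility out of the catch-all log path. A minimal sketch for /etc/syslog-ng/syslog-ng.conf, assuming Debian's stock source/destination names (s_src, d_syslog):

        # Drop cron chatter from the main syslog destination.
        filter f_no_cron { not facility(cron); };
        log { source(s_src); filter(f_no_cron); destination(d_syslog); };

    If you still want the messages, point a second log statement with facility(cron) at a dedicated destination such as /var/log/cron.log instead of dropping them.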


  • Remote RAID Control in ESXi on a Dell PowerEdge 2950 Using OpenManage

    - by yoyomommy
    I was wondering how one can add a drive to an existing RAID array while ESXi is still running. I have read that you are able to use Dell OpenManage to do this. I have installed OMSA 7.0 on the VMware ESXi host (5.0, fully updated) and I've installed OpenManage Essentials on a Windows Server 2008 R2 guest. The issue I'm having is that OpenManage is unable to see my RAID controller. I have seen videos and photos in online guides on how to do this, so I assume the functionality exists and I just have it set up wrong.


  • Windows 7 just corrupted Microsoft F# and now won't fix itself.

    - by user225626
    I lost half a year's worth of custom CGAL/GMP/LEDA/Qt3/Qt4/MPFR/MPFI/Visual Studio/Boost/blahblahblah open source nonsense, with DLLs scattered in various parts of the OS, that I had finally gotten to work. All because F#, which came with .NET 4 and which I never, ever used, is now "corrupted" one day for reasons only Windows 7 knows, and now Windows 7 won't even boot. Of course none--none--of the above setup works when accessed from the USB port. How do I remove a corrupted file that prevents Windows 7 from booting, or even from being repaired by an external CD mount of the Windows 7 disc? Thanks for any help.


  • ldirectord refusing connection when nginx redirects from http to https

    - by Adam
    I am running ldirectord as a load balancer in front of an nginx server. If I set up a redirect from http to https and connect directly to the nginx server, all is well. Connecting via ldirectord causes my connection to be refused. I can connect normally via http or https through ldirectord when I don't have the redirect in place. To add to my confusion, if my application issues the redirect from http to https, it works. I am testing this via curl on the command line ("curl: (7) couldn't connect to host" versus a response). I am using the standard ldirectord config (http://www.ultramonkey.org/3/topologies/config/lb/non-fwmark/linux-director/ldirectord.cf), the http and https parts. My nginx config for the redirect is simply: location / { rewrite ^(.*) https://$host$1 permanent; }
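
    A plausible explanation: ldirectord's negotiate health check for the http service now receives a 301 instead of the page body it expects, marks the real server down, and the VIP then refuses connections. A sketch of the two usual fixes in ldirectord.cf (addresses and the receive string are assumptions; nginx's stock 301 body contains "301 Moved Permanently"):

        virtual=192.168.0.10:80
                real=192.168.0.21:80 gate
                service=http
                # Option 1: only test that the port accepts connections
                checktype=connect
                # Option 2: keep the negotiate check but expect the redirect
                #checktype=negotiate
                #request="/"
                #receive="301 Moved Permanently"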


  • Family Tree: myheritage.com

    - by Nitesh Panchal
    Hello, the other day I accidentally visited the site myheritage.com, and I was left wondering how they must have built it. Can anybody tell me what their database design might be, and if possible, an algorithm we could use to generate such a tree? Generating a simple binary tree is very easy using recursion. But if you look at the site (if you have time, please make an account and add a few nodes to get a feel for it), when you add a son to a father, the mother is automatically added (if you don't add her explicitly). The mother's family tree is also generated side by side, and many other fancy things happen. In a simple binary tree we have a root node and many nodes below it, so we cannot show a wife and husband in the tree and then draw a line from both to a child. In your spare time, can anybody discuss what its database design might be and the recursive algorithm we could follow to generate it? I hope I am not asking too much of you. :)
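
    For reference, the usual relational model is not a binary tree but a self-referencing table (a directed acyclic graph), which is exactly what lets a child point at both a father and a mother. A minimal sketch in standard SQL (table and column names are assumptions; MySQL of that era lacked recursive CTEs, so there you would walk the tree in application code):

        CREATE TABLE person (
            id        INT PRIMARY KEY,
            name      VARCHAR(100) NOT NULL,
            sex       CHAR(1),
            father_id INT NULL REFERENCES person(id),
            mother_id INT NULL REFERENCES person(id)
        );

        -- All ancestors of person 42, walking both parent links recursively
        WITH RECURSIVE ancestors AS (
            SELECT * FROM person WHERE id = 42
            UNION ALL
            SELECT p.*
            FROM person p
            JOIN ancestors a ON p.id IN (a.father_id, a.mother_id)
        )
        SELECT * FROM ancestors;

    Spouses fall out of the model for free: a couple is simply two person rows that appear together as father_id/mother_id of at least one child (plus a separate marriage table if you want childless unions).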


  • Help building a PC that can image a dozen hard drives simultaneously

    - by Bigbio2002
    Not sure if this belongs on here or SuperUser, but here goes... I'm trying to figure out how to build a mass hard drive imaging PC out of COTS parts. A dedicated imaging device can do 10 drives at a time, but costs several thousand dollars. So far, I'm thinking of using several 3-port PCI-E FireWire cards and some kind of FireWire-to-IDE adapter to connect the drives themselves. The "software" would consist of scripting diskpart or some other imaging utility. The problem is that I can't seem to find any such adapter. I could use standard external hard drive bays, but then I'd have a dozen power cables to plug in. Ugly, messy, and inefficient. I picked FireWire over USB not only for better transfer speeds, but also because FireWire can deliver power over the bus (and could theoretically power a hard drive). Does anyone have any input on this?
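
    On the software side, a minimal sketch of the scripted approach on a Linux imaging box (device names are assumptions -- always verify with lsblk before running anything like this):

        #!/bin/sh
        # Image each attached source disk to a file, all in parallel.
        for dev in /dev/sdb /dev/sdc /dev/sdd; do
            dd if="$dev" of="/images/$(basename "$dev").img" \
               bs=4M conv=noerror,sync &
        done
        wait    # block until every dd has finished

    On Windows the equivalent would be feeding diskpart or another imaging utility one script file per drive; the parallelism idea is the same.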


  • OpenSCAD keeps crashing when I raise my precision past $fn=29?

    - by Jeremy Quick
    Out of the blue, OpenSCAD decided to stop working for me on designs that previously rendered just fine. After playing around, I found out that any time I use a precision greater than $fn=29, the program crashes with the message "openscad.exe has stopped working". I can use any precision of 29 and below, but the second I adjust it to even 30, the program crashes. This is a major problem, as before I was using a precision of 100, and without it the design's moving parts do not work when printed. I have uninstalled and reinstalled the program several times and haven't been able to fix the problem. I have watched the CPU during the calculations and it never reaches more than 35%. Does anyone have any suggestions on how to fix this problem? I also have not changed computers, or even the code. Thanks! I am using the 32-bit Windows version, 2013.06, the default version from the website.


  • How to ship a server internationally?

    - by devians
    I have a fileserver that will need to be shipped internationally soon. I'm looking for advice on packing and recommendations on methods/companies. Should it be shipped whole, or in parts? How should it be packed, and what precautions should I take? Any way you slice it, it's going to be very heavy. Will this cause problems? What's the best way to protect it from shock? It would be pointless for it to arrive with broken HDDs. If you've done this before, your hindsight would be invaluable.


  • Decent 1Gb switch (16-24 port) for rack

    - by TomTom
    Hello, for a rack containing a small number of servers (5 at the moment, going to stay in this area), I am looking to replace the aging 100Mbit switch with a 1Gbit switch. This is for the backend between the servers. I expect some iSCSI traffic there, so a 10Gbit option would be nice (preferably two ports, as extension modules). I don't need management; this is a pure backend of an internal cluster. I do VLANs, but there is no sensible management for the switch to do there. I would like: 1U only, obviously; preferably few moving parts; low price ;); enough power to run at least half the ports at full speed at the same time. Anyone have any recommendations?


  • Filtering semi-solicited spam

    - by Ketil
    While traditional UCE (get rich quick, enlarge your body parts, Nigerian barristers) is handled adequately, I'm still receiving a lot of not-quite-unsolicited spam. This is typically from commercial services forwarding "invites" from my "friends" and then "reminding" me of their services. Typical offenders are Facebook, LinkedIn, Dropbox, bebox, etc. (Of course, none of these services provide any way of opting out, except possibly by registering, which they will then take as an invitation to stuff your mailbox with even more crap.) What is a good way to deal with these? I can of course junk them using procmail, but is it a better idea to e.g. bounce them, or at least send a reply informing the sender (and "friend") that I am not interested in their service or their spam? Any solutions to this?
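
    For the procmail route, a minimal sketch for ~/.procmailrc (the sender patterns are assumptions -- check the actual From: headers of the offending mail):

        :0:
        * ^From:.*@(facebookmail\.com|linkedin\.com|dropbox\.com)
        social-noise

    This files the invites into a "social-noise" mbox rather than bouncing them; auto-replies and bounces to bulk senders rarely reach a human and mostly generate backscatter.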


  • Plesk directory structure problems

    - by johnnietheblack
    I have an entire website with the following directory structure:

        /example.com
            /html (public)
                /css
                /js
                index.php
            /lib
                session.php
                other_lib_files.php
            /views
                index.php
            /models
            /controllers

    As illustrated, the html directory is public, and anything above it is private. My site now needs to move to a new server (Linux with Plesk), which has the following structure (reduced to the problematic parts below):

        /myplesksite.com
            /httpdocs
                /css
                /js
                index.php
            /private
                /lib
                /models
                /views

    What I would THINK is that I should be able to put my /lib, /views, /models, etc. in the directory directly above /httpdocs, the same way I had it on my previous server. Is that possible? Or do I have to put them in /private? I would really love not to have to adjust internal paths throughout the site if not necessary...


  • Why don't I just build the whole web app in Javascript and Javascript HTML Templates?

    - by viatropos
    I'm getting to the point on an app where I need to start caching things, and it got me thinking... In some parts of the app, I render table rows (jqGrid, SlickGrid, etc.) or fancy div rows (like in the New Twitter) by grabbing pure JSON and running it through something like Mustache, jquery.tmpl, etc. In other parts of the app, I just render the info in pure HTML (server-side HAML templates), and if there's searching/paginating, I just go to a new URL and load a new HTML page.

    Now the problem is caching and maintainability. On one hand I'm thinking: if everything were built using JavaScript HTML templates, my app would serve just an HTML layout/shell and a bunch of JSON. If you look at the Facebook and Twitter HTML source, that's basically what they're doing (95% JSON/JavaScript, 5% HTML). This would mean my app only needed to cache JSON (pages, actions, and/or records), so you'd hit the cache whether you were a remote API developer accessing the JSON API or the straight web app. That is, I wouldn't need two caches, one for the JSON and one for the HTML. That seems like it'd cut my cache store down by half and streamline things a little.

    On the other hand, from what I've seen and experienced, generating static HTML server-side and caching that seems to be much better performance-wise cross-browser; you get the graphics instantly and don't have to wait that split second for JavaScript to render them. Stack Overflow seems to do everything in plain HTML, and you can tell: everything appears at once. Notice how on twitter.com the page is blank for 0.5-1 seconds before the page chunks in: the JavaScript has to render the JSON. The downside is that for anything dynamic (like endless scrolling, or grids), I'd have to create JavaScript templates anyway... so now I have server-side HAML templates, client-side JavaScript templates, and a lot more to cache.

    My question is: is there any consensus on how to approach this? What are the benefits and drawbacks, from your experience, of mixing the two versus going 100% with one over the other?

    Update: Some reasons that factor into why I haven't yet decided to go with 100% JavaScript templating:

    Performance. I haven't formally tested, but from what I've seen, raw HTML renders faster and more fluidly than JavaScript-generated HTML cross-browser. Plus, I'm not sure how mobile devices handle dynamic HTML performance-wise.

    Testing. I have a lot of integration tests that work well with static HTML, so switching to JavaScript-only would require 1) more focused pure-JavaScript testing (Jasmine), and 2) integrating JavaScript into Capybara integration tests. This is just a matter of time and work, but it's probably significant.

    Maintenance. Getting rid of HAML. I love HAML: it's easy to write, it prints pretty HTML, it keeps code clean and maintenance easy. Going with JavaScript, there's nothing as concise.

    SEO. I know Google handles the AJAX /#!/path, but I haven't grasped how this will affect other search engines or how older browsers handle it. It seems like it'd require a significant setup.
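
    For concreteness, the client-side path described above is a one-liner per row with mustache.js; a minimal sketch (element ids and the users array are assumptions):

        // `users` is the JSON the server already caches, e.g.
        // [{ "name": "Ann", "email": "ann@example.com" }, ...]
        var rowTemplate = '<tr><td>{{name}}</td><td>{{email}}</td></tr>';

        var rows = users.map(function (u) {
            return Mustache.render(rowTemplate, u);   // JSON -> HTML string
        });
        document.querySelector('#grid tbody').innerHTML = rows.join('');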


  • An online PHP debugger/code editor

    - by Zirak
    It's a simple deal: I'm sometimes in places where I don't have my laptop, and I find myself with spare time and an idea for a project. But unfortunately, I can't do anything about it. I tried a variety of solutions, which include running IDEs (like PhpStorm or Aptana) from a disk-on-key or CD (very slow and unappealing) and trying several online solutions (like http://phpanywhere.net), and found that all of them are either buggy, overloaded or underloaded with features, difficult to use, require FTP, etc. All that is required here is syntax highlighting and debugging alerts; no actual running of code. So the question is split in two: 1) Do you know of a good online PHP editor that you've used and enjoyed? 2) If not, how would you go about making one? The second part seems a bit general, so I'll try to expand... It might be a good idea: if you can't find one, make one. The question is about the concept of making a syntax highlighter (shouldn't be too difficult), and the difficult part of catching PHP errors WITHOUT executing any PHP code. Thank you in advance.
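
    On the "catching errors without executing" part: PHP's built-in linter only parses, never runs, the script, which is exactly this use case. A minimal server-side sketch (the form field and temp-file handling are assumptions):

        <?php
        // Lint user-submitted code without executing it: `php -l` stops
        // after the parse phase, so the script body never runs.
        $src = isset($_POST['code']) ? $_POST['code'] : '';
        $tmp = tempnam(sys_get_temp_dir(), 'lint');
        file_put_contents($tmp, $src);
        exec('php -l ' . escapeshellarg($tmp) . ' 2>&1', $out, $status);
        unlink($tmp);
        echo ($status === 0) ? 'No syntax errors detected'
                             : htmlspecialchars(implode("\n", $out));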


  • CodeIgniter: Library function--I'm stuck

    - by Kevin Brown
    I have a library function that sets up my forms and submits data. They're long, and they work, so I'll spare you reading my code. :) I simply need a way for my functions to determine how to handle the data. Until now, the function did one thing: submit a report for the current user. NOW, the client has requested that an administrator also be able to complete a form--this means that the form would be filled out and it would CREATE a user at the same time, whereas the current function EDITS data for an EXISTING user. Do I need a separate function to do essentially the same thing? How do I make one function perform two tasks: update a user, and if there is no user, create one?

    Current controller:

        function survey()
        {
            $id = $this->session->userdata('id');
            $data['member'] = $this->home_model->getUser($id);
            // Convert the db Object to a row array
            $data['manager'] = $data['member']->row();
            $manager_id = $data['manager']->manager_id;
            $data['manager'] = $this->home_model->getUser($manager_id);
            $data['manager'] = $data['manager']->row();
            $data['header'] = "Home";
            $this->survey_form_processing->survey_form($this->_container, $data, $method);
        }

    Current library:

        function survey_form($container)
        {
            // Lots of validation stuff
            $this->CI->validation->set_rules($rules);
            if ($this->CI->validation->run() === FALSE)
            {
                // Output any errors
                $this->CI->validation->output_errors();
            }
            else
            {
                // Submit form
                $this->_submit();
            }
            $this->CI->load->view($container, $data);
        }

    The submit function is huge too. Basically it says, "Update table with data where user_id = current user". I hope this wasn't too confusing. I'll create two functions if need be, but I'd like to keep redundancy down!
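
    One common pattern is to give _submit() an optional user id and branch on it: create the user first when none exists, then run the same update either way. A hedged sketch (table and column names are assumptions):

        // In the library: update when we have a user, create-then-update when
        // an administrator submits the form for a brand-new user.
        function _submit($user_id = NULL, $fields = array())
        {
            if ($user_id === NULL)
            {
                // Admin path: no existing user yet, so create one first.
                $this->CI->db->insert('users', array('email' => $fields['email']));
                $user_id = $this->CI->db->insert_id();
            }

            // Shared path: the same update serves both callers.
            $this->CI->db->update('surveys', $fields, array('user_id' => $user_id));
        }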


  • How to customize a Debian installation CD for your needs

    - by Frank
    I have a Debian CD which I want to customize for my own needs. I have extracted the CD and started to change some parts of it, e.g. the splash screen (splash.png), the installer title (through isolinux.cfg), etc. These are the things that I want to do: change the splash logo at installation startup to my own (which is done); change the grub boot parameters to use my company name; change the set of packages, so that I can have my own set of packages and only those packages are installed; do some post-installation steps; customize its startup and login screens to carry my company name. After I am done with this customization, I need to build the installer CD so that I can install it on any other system.
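
    For the rebuild step, a minimal sketch assuming the extracted tree lives in ./cd (package selection and post-install steps are normally driven by a preseed file dropped into the tree):

        # Repack the modified tree into a bootable isolinux ISO
        genisoimage -o custom-debian.iso -r -J \
            -b isolinux/isolinux.bin -c isolinux/boot.cat \
            -no-emul-boot -boot-load-size 4 -boot-info-table ./cd

    Tools like simple-cdd automate most of this (mirror subset, preseed, branding) if hand-rolling gets tedious.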


  • Huge HDD response time in Resource Monitor

    - by Mille
    I just bought all the parts for a computer, put it together, and installed a fresh copy of Windows 7. After a while of use, the computer gets very slow, and even shutting down Windows can take several minutes. I started to look in Resource Monitor and thought I'd found the answer watching my HDD. The thing is, the HDD completes all tests in Seagate's SeaTools for Windows successfully, which makes me doubt that diagnosis and whether I could send it in for a replacement. Any suggestions on what it could be and what I can do about it? Here is a screenshot from Resource Monitor:


  • Can I prevent Internet Explorer 8 from running scripts until the page is loaded?

    - by Tom W
    When trying to view a number of different websites in IE8, I get the following error message: "HTML Parsing Error: Unable to modify the parent container element before the child element is closed (KB927917)". On investigating the error message, it appears a script is trying to modify parts of the page before it is fully loaded. Is there a setting in IE8 I can change to prevent scripts from running until the page is fully loaded? EDIT: The sites in question used to work just fine, until I had to reinstall IE8 for a separate issue. Then they stopped working.
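
    As far as Microsoft's KB927917 article goes, this is by design and there is no browser setting for it; the fix is on the page side: a script must not modify a container element it is nested inside before that element is closed. A sketch of the usual page-side workaround, deferring the DOM work until parsing is done (the element id is an assumption):

        // Run after the document has finished loading instead of inline.
        window.onload = function () {
            var container = document.getElementById('content');  // hypothetical id
            container.appendChild(document.createElement('div'));
        };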


  • Tomcat deploy: make included scripts executable

    - by AlexS
    I'm developing a web application (for Tomcat) using NetBeans on Windows 7. For the web application to run, I need to run an install script once. This script (*.bat for Windows and *.sh for Linux) is included in my WAR file (WEB-INF). Now every time I deploy the WAR file and want to run the script on Linux, I have to call chmod +x install.sh first. Is there a way that this script can be made executable by default? I don't want to have to execute extra commands after the deploy to make the script executable. For clarification: I'm not new to Linux and I know how to set executable rights on files. That's not the problem. My problem is: what do I have to do so that this script is executable right after Tomcat has deployed (unpacked) my *.war file? If I were using Linux for development as well, I would try to set the rights accordingly in my sources (maybe I'll try it when I have a little more spare time). But I am using Windows and NetBeans. Are there any attributes I can set to achieve my goal, or is it possible to achieve this using Ant? By the way: are there security-related issues with this approach? The script looks for the java executable and calls a Java-based GUI installer...
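
    The WAR format is a ZIP, and Tomcat's unpacking does not carry Unix execute bits through, so the bit has to be set on the target host. Since the question mentions Ant, a minimal sketch of a post-deploy target (paths and property names are assumptions; Ant's chmod task is a no-op on Windows, so it is safe in a cross-platform build):

        <target name="fix-perms" description="Make the installer executable">
            <chmod file="${tomcat.home}/webapps/myapp/WEB-INF/install.sh"
                   perm="ugo+x"/>
        </target>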


  • Why are replies to certain emails missing from threads in Mutt?

    - by yarun can
    I use Mutt for email. I have threading enabled, and I can see that most of the emails are threaded in Mutt, so that is all good. But sometimes I reply to an email and the answer (from the other person) to my reply won't be part of any thread. The thing is, when I reply in Mutt (with Vim as my editor), the subject keeps getting longer and longer with many "Re"s. That is the case with those emails with missing threads. I have:

        set strict_threads="yes"
        set sort="threads"
        set edit_headers=yes

    I am wondering if this has anything to do with Mutt or with the person I am communicating with over email. Could this one be the culprit?

        set metoo=yes

    Any suggestions?
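
    A plausible culprit is the combination of strict_threads=yes and correspondents whose mailers drop the References/In-Reply-To headers: with strict threading, Mutt refuses to fall back to subject matching, so those replies land outside the thread. A sketch of the more forgiving settings for ~/.muttrc:

        set sort=threads
        set strict_threads=no   # allow subject-based threading as a fallback
        set sort_re=yes         # be smarter about "Re:" prefixes when matching

    The piling-up "Re: Re: ..." subjects themselves point at the other side's mail client, not at Mutt.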


  • Initial capacity of collection types, e.g. Dictionary, List

    - by Neil N
    Certain collection types in .NET have an optional "initial capacity" constructor parameter, e.g.:

        Dictionary<string, string> something = new Dictionary<string,string>(20);
        List<string> anything = new List<string>(50);

    I can't seem to find the default initial capacity for these objects on MSDN. If I know I will only be storing 12 or so items in a dictionary, doesn't it make sense to set the initial capacity to something like 20? My reasoning is: assuming the capacity grows the way it does for a StringBuilder, which doubles each time the capacity is hit, and each re-allocation is costly, why not pre-set the size to something you know will hold your data, with some extra room just in case? If the initial capacity is 100 and I know I will only need a dozen or so, the rest of that allocated RAM seems allocated for nothing. Please spare me the "premature optimization" spiel for the O(n^n)th time. I know it won't make my apps any faster or save any meaningful amount of memory; this is mostly out of curiosity.
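
    A quick way to answer this empirically: List<T>.Capacity is public, so you can watch it grow. A minimal sketch (on the Microsoft runtime of that era, a parameterless List<T> starts at capacity 0 and then grows 4, 8, 16, ... by doubling; Dictionary picks prime-sized bucket counts, which you can't observe as directly):

        using System;
        using System.Collections.Generic;

        class CapacityProbe
        {
            static void Main()
            {
                var list = new List<string>();
                int last = -1;
                for (int i = 0; i < 100; i++)
                {
                    if (list.Capacity != last)   // print only when it grows
                    {
                        Console.WriteLine("Count={0,3}  Capacity={1,3}",
                                          list.Count, list.Capacity);
                        last = list.Capacity;
                    }
                    list.Add("x");
                }
            }
        }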


  • How can I automate my Linux computer to power off (and on preferably) under certain circumstances?

    - by Ashimema
    OK, so a little background: I've been using Windows Home Server as a backup appliance, media server, and share server at home for some time. I decided it was costing me a lot of juice, so very early on I added the "Lights Out" add-on to ensure it was only running as and when needed. I'm now looking to switch to a Linux-based server, and I'm looking for a similar tool or set of tools for advanced power management. Now the question: anyone got any all-in-one suggestions (i.e. with client parts for both Windows and Linux and a server part for the Linux server), or can anyone simply verify that I'll need to set up all the individual bits for this myself separately? (A tool similar to "SmartPower" but for Linux would be a great start.)
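
    If it comes to assembling the bits yourself, the scheduled power-off half is a one-liner with rtcwake from util-linux, which powers the machine down and programs the RTC to wake it; a sketch (the wake time is an assumption):

        # Power off now, wake tomorrow at 03:00 for the backup window
        sudo rtcwake -m off -t "$(date -d 'tomorrow 03:00' +%s)"

    For on-demand wake-ups from clients, Wake-on-LAN (e.g. the wakeonlan or etherwake utilities sending a magic packet to the server's MAC address) is the usual counterpart.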


  • MySQL Cluster Failover doesn't work

    - by Lukasz
    I have two servers. The first server, 10.100.15.150, runs: one mgm server, one ndbd, one mysql api. The second server, 10.100.15.160, runs: one ndbd, one mysql api. When I start all parts of the cluster, it looks like this:

        Cluster Configuration
        [ndbd(NDB)]     2 node(s)
        id=21   @10.100.15.150 (mysql-5.1.56 ndb-7.1.17, Nodegroup: 0)
        id=22   @10.100.15.160 (mysql-5.1.56 ndb-7.1.17, Nodegroup: 0, Master)

        [ndb_mgmd(MGM)] 1 node(s)
        id=3    @10.100.15.150 (mysql-5.1.56 ndb-7.1.17)

        [mysqld(API)]   2 node(s)
        id=11   @10.100.15.150 (mysql-5.1.56 ndb-7.1.17)
        id=12   @10.100.15.160 (mysql-5.1.56 ndb-7.1.17)

    When I shut down the first machine, 10.100.15.150, the ndbd process on the second machine also shuts down, so I cannot use that data node and the cluster fails... How must I configure this cluster to get failover working? Thx
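
    This looks like the classic NDB arbitration problem: the management node is the arbitrator, and it lives on the same host as one data node. When that host dies, the surviving ndbd cannot reach the arbitrator, must assume it could be the minority half of a split brain, and shuts itself down by design. The usual fix is to run ndb_mgmd on a third machine; a sketch of the relevant config.ini layout (the third host's address is an assumption):

        [ndb_mgmd]
        NodeId=3
        HostName=10.100.15.170   # arbitrator on hardware separate from both data nodes

        [ndbd]
        NodeId=21
        HostName=10.100.15.150

        [ndbd]
        NodeId=22
        HostName=10.100.15.160

    Even a small VM or an existing unrelated box works for the management node; it only has to survive the failure of either data-node host.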


  • How could I portably split large backup files over multiple discs?

    - by sourcejedi
    Context: I make backups/archives, primarily of photos. I'm experimenting with Bup, which is designed for backup to hard disk. Basically it creates Git repos which include packfiles of up to 1GB. But I still need last-ditch backups to keep offline and move offsite (and keeping them on read-only media is good too!). What are the options for archiving and splitting large files over several discs like CDs (and reading them back!)? I'd prefer methods which will stay readable in future, are portable (e.g. to Windows), and have known simple implementations, so I could re-implement them myself if necessary. (Using Bup packs will stretch my robustness budget, so I want to be confident about how the other parts of the system would behave.) I heard split archives are possible with both ZIP and 7-Zip. Is that right?
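
    The simplest candidate by the "re-implementable" criterion is plain split(1): it just cuts the byte stream, so reassembly is trivial on any platform. A sketch for CD-sized pieces:

        # Cut a large packfile into 650 MB pieces, one per disc
        split -b 650m big-archive.pack big-archive.pack.part-

        # Reassemble later -- plain concatenation, nothing format-specific
        cat big-archive.pack.part-* > big-archive.pack

    On Windows the same join is "copy /b part1+part2+... whole", which is about as portable as it gets. (And yes, both ZIP and 7-Zip support native multi-volume archives, but those embed format metadata in the pieces, so they are less trivially re-implementable.)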


  • Slow write speeds on new Gigabit home file server

    - by Ryan Holder
    So I finally got all my parts delivered to set up a home file/backup server this week. It's currently running Ubuntu Server and I'm using Samba to share files on my network. The server currently has a 2TB WD Green drive in it, connected to an Asus M5A78L-M board. This is connected via Cat6a to my new gigabit switch (TP-Link TL-SG1005D). My home desktop is also connected to this switch, again through Cat6a cable. Currently, when transferring files, I get a steady 100 MB/s reading from the server to my Windows machine. When copying from my Windows machine to the server, I get around 30-38 MB/s. I know this drive is capable of faster speeds, so would anybody have an idea of where the bottleneck is? Any help would be greatly appreciated :) EDIT: I have found that FTP's write speed is much closer to my Samba read speed, so I'm going to guess it is a software problem rather than hardware.

