Search Results

Search found 21061 results on 843 pages for 'bulid process'.


  • Strange memory usage pattern on Windows Server 2008 on login through Remote Desktop

    - by headsling
    I'm running Windows Server 2008 Datacenter Service Pack 2 on a VMware instance with 10 GB of RAM allocated. I'm not running IIS or SQL Server. Under 'normal' conditions the machine uses ~5.5 GB of memory. However, when I log in to the server through Remote Desktop, memory usage slowly climbs to 9.8 GB in use. After several minutes it slowly creeps back down to the 5.5 GB mark. I've tried killing all the processes associated with my login as soon as I log in, everything except Task Manager, without success, and I can't see any single process growing in memory while usage is increasing. I'm assuming this is some system-level cache growing and shrinking... but why is it doing this?

    Read the article

  • Resize2fs at 81h and counting

    - by Adam
    Setup: 12x 1 TB drives in a RAID6 (mdadm) array, cryptsetup running on top of the mdadm array, LVM running on the encrypted device, and ext4 on the LVM. Background: I added a new drive to the RAID (increasing from 11 to 12 drives) and 'bubbled' up through the layers (mdadm, etc.) to resizing the ext4 partition. This machine is used as a centralized repository for photography and as a backup server (for both Windows and Mac machines), so bringing it down to add the drive and wait for the resizing wasn't really an option. So I started the resize operation several days ago. htop reports the resize2fs operation as having run for 81 hours now. dmesg and syslog are both clear, and the drives are still accessible. The resize command reported that it started an online resize of the partition, so the process IS running, and it is burning through 100% of one of my cores. Question: Is it normal for the operation to take this long, or has something gone horribly wrong? Where would I start looking for signs of trouble?
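
    One crude way to tell whether an online ext4 resize is still making progress, rather than spinning, is to watch the filesystem's reported total size, which should grow in steps as block groups are added. A minimal sketch, assuming Python is available on the box and that /srv/photos is a placeholder for the actual mount point:

        import os
        import time

        MOUNT_POINT = "/srv/photos"   # placeholder: mount point of the ext4 volume being resized

        previous = None
        while True:
            st = os.statvfs(MOUNT_POINT)
            total_bytes = st.f_frsize * st.f_blocks
            if previous is not None and total_bytes != previous:
                print("filesystem grew: %.2f TiB -> %.2f TiB"
                      % (previous / 2**40, total_bytes / 2**40))
            previous = total_bytes
            time.sleep(60)   # one check per minute; no growth over several hours suggests a stall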

    Read the article

  • Path erased in Debian

    - by Lyon83
    I'm trying to deploy a Rails app on Debian, using Apache/Passenger. I was trying to fix a problem with some gems and in the process I executed this in the console: export PATH=/var/lib/gems/1.8/bin/:${vendor/cache} Now my PATH environment variable is gone, or at least its content. My server is running Debian 6. Is there a way to recover my PATH? Or at least can someone point me to the file where that variable is stored? Some help please. This is a BIG problem for me. Thanks in advance!

    Read the article

  • Monitoring disk block access in Linux

    - by VoidPointer
    Is there a way to gather statistics about blocks being accessed on a disk? I have a scenario where a task is both memory and I/O intensive and I need to find a good balance as to how much of the available RAM I can assign to the process and how much I should leave for the system for building its I/O cache for the block device being used. I suspect that most of the I/O that is currently happening is accessing a rather small subset of the device and that performance could be optimized by increasing the RAM that is available for I/O buffering. Ideally, I would be able to create something like a "heat-map" that shows me which parts of the disk are accessed most of the time.
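
    If blktrace can be run against the device, its blkparse output carries the sector offset of every request, which is enough for a crude text heat map. A rough sketch, assuming the default blkparse text format ("sector + length" pairs) and a made-up device size; a pipeline along the lines of blktrace -d /dev/sdb -o - | blkparse -i - | python heatmap.py should feed it, but treat the exact invocation as an assumption:

        import re
        import sys
        from collections import Counter

        BUCKETS = 64                    # resolution of the heat map
        DEVICE_SECTORS = 2 * 1024**3    # hypothetical: device size in 512-byte sectors

        # blkparse request lines end with something like "... W 223490 + 8 [kjournald]"
        sector_re = re.compile(r"\s(\d+) \+ (\d+)\s")

        heat = Counter()
        for line in sys.stdin:           # stop the trace with Ctrl-C to flush the histogram
            m = sector_re.search(line)
            if m:
                heat[int(m.group(1)) * BUCKETS // DEVICE_SECTORS] += 1

        peak = max(heat.values() or [1])
        for b in range(BUCKETS):
            print("%3d%% %s" % (b * 100 // BUCKETS, "#" * (heat[b] * 50 // peak)))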

    Read the article

  • How Do I Migrate 100 DBs From One MS-SQL 2008 Server To Another? (looking for automation)

    - by jc4rp3nt3r
    Let me start by saying that I am not a DBA, but I am in a position where I am responsible for moving just under 100 MS-SQL 2008 DBs from our current development server to a new/better/faster development server. As this is just a local dev server, temporary downtime is acceptable, but I am looking for a way to move all of the databases (preferably in bulk). I know that I could take a .bak of each and restore it on the new server, but given the volume of DBs I am looking for a more efficient way. I am not opposed to learning a new piece of software, writing code or any other requirement, so long as it speeds up the process.
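
    For what it's worth, one low-tech way to automate this is to generate the BACKUP/RESTORE statements from the source server's own catalog instead of scripting each database by hand. A minimal sketch using pyodbc; the server name and share path are placeholders, and it only prints the T-SQL so it can be reviewed before anything is executed:

        import pyodbc

        SOURCE = "OLD-DEV-SQL"               # placeholder: current dev server
        BACKUP_DIR = r"\\fileshare\sqlmove"  # placeholder: path reachable from both servers

        conn = pyodbc.connect("DRIVER={SQL Server};SERVER=%s;Trusted_Connection=yes" % SOURCE)
        cur = conn.cursor()
        # database_id <= 4 are the system databases (master, tempdb, model, msdb)
        cur.execute("SELECT name FROM sys.databases WHERE database_id > 4")

        for (name,) in cur.fetchall():
            bak = r"%s\%s.bak" % (BACKUP_DIR, name)
            print("BACKUP DATABASE [%s] TO DISK = N'%s' WITH INIT, COPY_ONLY;" % (name, bak))
            print("-- run on the new server; add WITH MOVE clauses if the file paths differ:")
            print("RESTORE DATABASE [%s] FROM DISK = N'%s' WITH RECOVERY;" % (name, bak))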

    Read the article

  • Restarting nginx backends without losing requests

    - by Oli
    I'm sure it's been asked before in different words, but I run several Django sites via uWSGI (emperor mode) behind nginx. It's all a fairly standard configuration, but I find that if I restart the central uWSGI process, nginx just throws 502s rather than waiting for the socket to become available again. I recognise that most of this is probably for a reason, but people seeing 502 errors really stings me. It's certainly not something I want a client to see. So... can I beg nginx to wait for/retry backends? Or is there anything (other than the obvious) I can do to minimise commercial damage from uWSGI restarts?

    Read the article

  • How can I be alerted if an application is not running in Windows 2008 R2?

    - by Magnetic_dud
    I have a critical application that I need to keep running on my server. Unfortunately it's poorly coded and it keeps crashing. If it's not running it's a big problem, but I can't use a simple application monitor like this one, because if the app crashes I need to input the configuration again - so I can't just run it again, I have to RDP into the server and start it manually. So I need a monitor that sends me an email if the process has stopped. Does anyone know of a program that can do that job? I couldn't find one.
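
    If no off-the-shelf monitor fits, the alerting half is small enough to script and schedule with Task Scheduler every few minutes. A minimal sketch, assuming Python with psutil is installed on the server; the process name, SMTP host and addresses are placeholders:

        import smtplib
        from email.message import EmailMessage

        import psutil

        PROCESS_NAME = "CriticalApp.exe"   # placeholder: image name as shown in Task Manager
        SMTP_HOST = "smtp.example.com"     # placeholder mail settings
        MAIL_FROM, MAIL_TO = "monitor@example.com", "me@example.com"

        def is_running(name):
            # True if any running process has the given image name
            return any(p.info["name"] == name for p in psutil.process_iter(attrs=["name"]))

        if not is_running(PROCESS_NAME):
            msg = EmailMessage()
            msg["Subject"] = "%s is not running" % PROCESS_NAME
            msg["From"], msg["To"] = MAIL_FROM, MAIL_TO
            msg.set_content("The process has stopped; RDP in and restart it with its configuration.")
            with smtplib.SMTP(SMTP_HOST) as smtp:
                smtp.send_message(msg)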

    Read the article

  • Partner Showcase

    - by rituchhibber
    Building a High Performance Employee Self Service Portal with Oracle WebCenter - Free Half Day Technical Workshop. Organisations started with static corporate intranets at the beginning of the “Noughties”; these have been evolving into the intranet portal that is common today. The rise in Employee Self Service leverages this evolution to transform the intranet into a resource that delivers the “contextual worker's control panel”. This empowers employees to do their complete job from a single environment, covering transactions, document handling, form completion, watching presentations and participating in discussions, through to utilising search functionality. Ether Solutions - the Enterprise Portal specialists - together with C2B2 - the independent middleware experts - will deliver this workshop to you, allowing you to discover how Oracle WebCenter provides a high-performance, highly scalable platform for social intranets and Employee Self Service Portals. To register, please click here. When? Wednesday, 12th of December 2012. Where? Institute of Directors, 116 Pall Mall, London SW1Y 5ED. Who should attend? Lead Developers, Technical Architects, Solution Architects, Technical Leads and other technical team members interested in learning about WebCenter. Lingotek - Collaborative Translation Technology. Lingotek is the leading provider of Collaborative Translation Technology, designed to meet the requirements of organizations challenged with communicating with, interacting with, and commercializing to a global audience. Lingotek software helps companies achieve unprecedented control over the translation process and enables them to capture, grow and reuse their linguistic assets. Lingotek has deployed systems for some of the most innovative organizations in the United States and has enabled the success of large Fortune 500 corporations, small professional firms, and companies of every size in between. For further information, please click here.

    Read the article

  • trying to upgrade memory

    - by user214876
    I've been using Ubuntu on my laptop for a while now. Not quite used to it yet. I've got an Acer Aspire with the original 4 GB of memory and a 500 GB HDD, presently running 12.04, 32-bit. I have the 13.04 upgrade disc and want to upgrade my memory to 8 GB. Every time I install the 8 GB of memory, the system won't boot to either version. I downloaded the 64-bit version of both releases of Ubuntu but no results yet. Can anyone offer a suggestion here? I'm kinda lost. Additional information: the memory was purchased through Acer/Kingston and is recommended for this computer. I watched the video on installing it, so I doubt it's installed wrong (there's only one way of putting it in). I swapped operating systems, from Ubuntu 12.04 to 13.04 to 13.10 and now to the Xubuntu 13.10 64-bit version. I'm still not having any luck with this upgrade. Would it be necessary to upgrade the CPU? It's just a thought; I don't know what else could keep me from utilizing the new memory. Additional information: I called Kingston this afternoon and they are sending replacement lower-density memory modules (2/4 GB - 8 GB). Tech support says I need to upgrade the BIOS via DOS to utilize the new memory, since it is no longer a Windows system. I'm not sure how to go about that, but it's a learning process I can live with. Thank you all for your help/support. I realize this isn't an Ubuntu problem, but each new user of this OS seems to share similar problems and maybe someone can use this info to their advantage.

    Read the article

  • Open source monitoring tool without sending data to "Their Server"

    - by hangu
    I'm trying to find an open source server monitoring tool. I know there are a lot, but I couldn't find what I need. The basic process of the monitoring tools I've used before was: 1) install an agent on the server I want to monitor; 2) the agent sends data to "their server"; 3) I check the health of my server through a web page they host. What I need is to avoid step 2. Are there any monitoring tools I can use like that? I have Windows 2008 and Linux servers; simple features like CPU, memory and network will be enough. Thank you
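
    If nothing ready-made fits, the "agent that keeps its data to itself" part is small enough to sketch: sample CPU, memory and network locally and append to a file that never leaves the server. A minimal sketch with psutil, which should work on both the Windows 2008 and Linux boxes; the log path is a placeholder:

        import json
        import time

        import psutil

        LOG_PATH = "/var/log/local-metrics.jsonl"   # placeholder; stays on this server

        while True:
            net = psutil.net_io_counters()
            sample = {
                "ts": int(time.time()),
                "cpu_percent": psutil.cpu_percent(interval=1),
                "mem_percent": psutil.virtual_memory().percent,
                "net_bytes_sent": net.bytes_sent,
                "net_bytes_recv": net.bytes_recv,
            }
            with open(LOG_PATH, "a") as log:
                log.write(json.dumps(sample) + "\n")
            time.sleep(59)   # roughly one sample per minute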

    Read the article

  • Visual Studio 2012 intermittent lockup

    - by user1892678
    Visual Studio 2012 intermittently locks up on me. I notice that devenv.exe jumps to 50% CPU utilization. The CPU stays at this level for a few minutes and then drops. While it's at 50% utilization I can still use the IDE. However, it intermittently stops responding (as though it were performing some sort of background process). It only lasts for a few seconds. This also happens when debugging. I'm running under Windows 7 and I'm using Telerik controls. I've disabled add-ins and extensions and have had no success. Any ideas would be appreciated. Thanks

    Read the article

  • Security against IP spoofing [on hold]

    - by user1369975
    I am pursuing a college project in which I am running three fake services on three ports to protect the main service (say, running on port 80). The concept is that a malicious user will try to bring the services down and end up probing the fake services. Each fake port accepts the connection request, records the client's IP and port, and logs them; those clients are then denied access to the service on port 80. But what do I do if the client spoofs his IP? How should I modify my system?
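
    Worth noting: because a TCP connection has to complete the three-way handshake before the server accepts it, a client that blindly spoofs its source address normally cannot get one of these decoy connections established at all. As an illustration of the decoy-port idea described above, here is a minimal sketch; the port number and blacklist path are hypothetical:

        import socketserver

        DECOY_PORT = 8081                 # placeholder: one of the three fake-service ports
        BLACKLIST_FILE = "blacklist.txt"  # consumed by whatever guards port 80

        class DecoyHandler(socketserver.BaseRequestHandler):
            def handle(self):
                ip, port = self.client_address
                # Record the peer; the real service on port 80 refuses IPs listed here.
                with open(BLACKLIST_FILE, "a") as f:
                    f.write("%s:%d\n" % (ip, port))
                self.request.close()      # drop the connection immediately

        if __name__ == "__main__":
            with socketserver.ThreadingTCPServer(("", DECOY_PORT), DecoyHandler) as server:
                server.serve_forever()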

    Read the article

  • Web hosting: deciding whether to pay for hosting or host your own?

    - by pllee
    Is there a guide out there on how to choose between paying for web hosting and hosting your own? Assuming that root access is a must, I would like to compare things like cost, scalability and personal stress. Here is what I could come up with.
    Paying for web hosting - benefits: much cheaper at a small scale (I assume anything under $50 a month would be cheaper than paying for the bandwidth of hosting yourself); no stress dealing with power outages, server restarts or the internet going down; for the most part less busywork involved in setting up. Negatives: cost goes way up when higher specs are needed (for example, monthly cost triples for the ability to use 8 GB of RAM that you can buy for $90), which means you have to target a particular RAM usage and monitor it so your instance stays within the threshold; root access is usually a premium; you may get tied into a vendor-specific deployment process.
    Hosting your own - positives: 100% control of specs and software; once you get past paying for the bandwidth you get much more bang for your buck by building your own machine. Negatives: doesn't make financial sense if bandwidth costs are more than web hosting costs; you have to deal with power outages, server restarts and the internet going down.
    I think the best of both worlds would be a place that dealt with bandwidth, power outages and server restarts but let you provide your own server - kind of like 24-hour day care for a server. Does anything like that exist?
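
    The figures quoted above reduce to a simple break-even calculation: the recurring hosting fee on one side, amortised hardware plus your own bandwidth and power on the other. All numbers below are made-up placeholders; the point is the shape of the comparison, not the values:

        # Hypothetical figures -- substitute real quotes before drawing conclusions.
        hosted_per_month = 150.0          # e.g. a plan that allows 8 GB of RAM
        own_hardware_cost = 900.0         # self-built server, including the $90 8 GB of RAM
        hardware_lifetime_months = 36     # amortisation period
        bandwidth_per_month = 60.0        # business line or colo bandwidth
        power_per_month = 15.0

        own_per_month = (own_hardware_cost / hardware_lifetime_months
                         + bandwidth_per_month + power_per_month)

        print("hosted : $%.2f / month" % hosted_per_month)
        print("own    : $%.2f / month" % own_per_month)
        print("hosting your own %s cheaper on these numbers"
              % ("is" if own_per_month < hosted_per_month else "is not"))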

    Read the article

  • Tomcat SSL integration issue

    - by small_ticket
    Hi all, I've bought a wildcard SSL certificate from a company; I sent them the CSR file and they sent me two certificate files, namely CA.txt and com_sertificate. I've searched the web and found some tutorials about Tomcat and SSL, but I can't get anywhere with these two files. All those tutorials mention different files that I don't have. (I asked the company I bought the certificates from about this process, but they said they don't have any knowledge of Tomcat integration.) Does anyone have an idea about this? P.S. I'm using Ubuntu 8.04 server, Java 1.6 and Tomcat 6.

    Read the article

  • Apache Rewrite Rules

    - by Philip
    I have moved my website from a Wiki to Wordpress and in the process, realised that I have broken links to some popular pages on my website. Is it possible to fix this with a rewrite rule? I need the rule to redirect anything beginning with "^/wiki/(.+)$" to "/$1" but also replacing the "_" character used in MediaWiki slugs to "-" used in Wordpress slugs. For example: http://example.com/wiki/An_Example_Page should be pointed to: http://example.com/an-example-page Is it possible to write such a rewrite rule? Edit: It appears that Wordpress doesn't even care if the "/wiki/" part is removed - provided the slug matches, and that seems to be case-insensitive too. So all I need to do is change the "_" characters to "-" in the slugs.
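
    The mapping itself is mechanical: drop the /wiki/ prefix, lower-case the slug and turn underscores into hyphens. In mod_rewrite this usually means either a RewriteMap or an iterative rule with the [N] flag, since one plain rule can't replace an arbitrary number of underscores. As an alternative illustration, here is a rough sketch of the same transformation as a tiny Python WSGI redirector; all names and the deployment are hypothetical:

        from urllib.parse import unquote
        from wsgiref.simple_server import make_server

        def wiki_redirect(environ, start_response):
            path = unquote(environ.get("PATH_INFO", ""))
            if path.startswith("/wiki/"):
                # /wiki/An_Example_Page -> /an-example-page
                slug = path[len("/wiki/"):].lower().replace("_", "-")
                start_response("301 Moved Permanently", [("Location", "/" + slug)])
                return [b""]
            start_response("404 Not Found", [("Content-Type", "text/plain")])
            return [b"not a wiki URL"]

        if __name__ == "__main__":
            # Hypothetical: proxy only /wiki/ requests to this app from Apache.
            make_server("127.0.0.1", 8000, wiki_redirect).serve_forever()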

    Read the article

  • Model View Controller² [closed]

    - by user694971
    I am working on a quite complex web application in Go and I tried to stay within an MVC pattern. However, I ended up with a structure isomorphic to this:
    /boilerplate - the usual boilerplate an application needs to survive in the wilderness
    /db - layer talking to an SQL DB
    /helpers - helpers
    /logic - backend logic, not directly affiliated with any routes, sessions etc.
    /templates - view
    /web - glue between /logic and /templates
    In more dynamic languages the size of /web would be next to zero, but Go doesn't exactly have a RoR built in, so I need a lot of helper structures to feed the templates with data and to process GET/POST parameters and session information. I remember once reading about patterns similar to MVC with one extra letter, but Wiki-searching I couldn't find it right now. (BTW, currently /logic also contains data retrieval from API services to fill some hash maps; this is no simple task, but that probably belongs in the model, right?) So the question: is this structure considered sane? Or does it need some bending to be tagged an MVC app?

    Read the article

  • What's the story with ubuntu?

    - by A-ha
    Guys, I've tried to install Ubuntu 10.10 desktop edition on my laptop and unfortunately it didn't detect my keyboard (it detects my mouse), so I couldn't finish the installation. Am I doing something wrong? There isn't really much to specify during the installation process, and I'm really disappointed that such a trivial task as installation cannot be done without asking a question on a forum. Or maybe, just because I'm trying to install it on a laptop, should I download the notebook edition? But that sounds silly to me.

    Read the article

  • Upgraded from 11.10 to 12.04 now no network access

    - by MadeTheLeap
    A few weeks ago I decided I should enter the Linux world and read that Ubuntu is the most widely used release. I installed version 11.10 and it worked perfectly. Just this past week I decided I would do the upgrade to 12.04. The upgrade process itself worked fine. However, when I logged in I no longer had a network connection. I am running an AMD-based PC with a D-Link DFE-530TXS network card and, as I said, it worked fine in 11.10. I have scoured the Internet and come across a thousand slightly varying solutions, but they are too convoluted for someone new to Linux. Not because I can't follow the steps, but because most of the tools/utilities that are referenced (e.g. to compile, install, etc.) are not available when I use the stated steps in the solutions. So... should I reinstall 11.10, or is there hope of getting this version to use the NIC that I know works? I have the latest driver from D-Link for my NIC but I have no idea how to 'install' it for Ubuntu 12.04 to use. I know you will require additional information, but I wasn't sure what you would need. Thanks in advance.

    Read the article

  • Monitor disk I/O for specific drive in OS X

    - by raffi
    In my Macbook Pro, I have two internal drives and I've connected a third drive via USB in enclosure. I am currently doing a secure wipe of the external drive and I was interested in seeing what the disk I/O was for that particular drive, but when I use Activity Monitor I only see the total disk usage for all drives combined. Is there any way to monitor a specific drive's total I/O, preferably via a built-in or free method? I don't want to filter by process ID. I just want to filter by mounted disk.
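
    Two free options, for what it's worth: the BSD iostat that ships with OS X can be pointed at a single device (something along the lines of iostat -d -w 5 disk2, though check the man page for the exact flags), and psutil exposes per-disk counters to Python. A minimal sketch of the latter; the disk name is a placeholder for whatever diskutil list reports for the USB enclosure:

        import time

        import psutil

        DISK = "disk2"   # placeholder: device name of the external drive

        prev = psutil.disk_io_counters(perdisk=True)[DISK]
        while True:
            time.sleep(5)
            cur = psutil.disk_io_counters(perdisk=True)[DISK]
            print("%s  read %6.1f MB/s  write %6.1f MB/s"
                  % (DISK,
                     (cur.read_bytes - prev.read_bytes) / 5 / 1e6,
                     (cur.write_bytes - prev.write_bytes) / 5 / 1e6))
            prev = cur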

    Read the article

  • Windows 7 startup repair with Truecrypt

    - by PHLiGHT
    I have many computers encrypted with TrueCrypt 7.1a (the current version) with the whole drive encrypted. Today one of them shows the Windows 7 splash screen for a moment and then goes into Startup Repair, which can't read the encrypted drive. I've tried the various safe modes and whatnot. The solution is to decrypt the drive and then run Startup Repair to fix it. The problem is that that is going to take 50 hours. I've started that process for this machine, but I need a way to cover myself when this happens to the next PC. What can I do to avoid decrypting the whole drive? I can't be the only one facing this problem, so I feel like I must be missing something. Thanks!

    Read the article

  • Do I need to Sysprep Windows 7

    - by Cell-o
    Let's say I have one image and I want to put the same image on many identical Lenovo laptops. These new machines have a site licence (Office 2010, Windows 7). My questions: 1 - What software do you recommend for this project, e.g. Acronis True Image, Clonezilla, MDT? 2 - How do I take the image - after the Windows 7 and Office 2010 activation process, or before? I'm very confused. For example, many websites say "you must sysprep when deploying a Windows 7 image." Is that correct, and if so, why?

    Read the article

  • cygwin ssh shortcut on windows desktop

    - by Alex Berkoff
    I have multiple servers that I need to remote into, and I prefer Cygwin over PuTTY to do so. Anyhow - the process of opening my Mintty window and then typing the following commands takes too long. PS - I am using key authentication to these servers. First, I double-click the Cygwin Terminal shortcut on my Windows desktop. Then, once the terminal session has started, from the command prompt I type the following:
    $ eval `ssh-agent`
    $ ssh-add
    $ ssh <username>@<servername>
    Please keep in mind that the 'servername' is variable. In fact I have about 10 different server names that could potentially be inserted there - hence my need for 10 different shortcuts. I would prefer to double-click something on my desktop that fires up Mintty and automatically executes the above bash shell commands. Does anyone have, or can anyone recommend, a nice/elegant solution for this?

    Read the article

  • Unable to debug javascript?

    - by linkme69
    I'm having some problems debugging an encoded JavaScript. The script I'm referring to is given in this link over here. The encoding is simple: it works by shifting the Unicode values by whatever CodeKey was used during encoding. The code that does the decoding is given below:
    <script language="javascript">
    // dF() decodes an escaped, character-shifted string and writes the result into the page.
    function dF(s){
        // Everything except the final character of s is the escaped payload.
        var s1=unescape(s.substr(0,s.length-1));
        var t='';
        // The last character of s is the shift amount; subtract it from every char code.
        for(i=0;i<s1.length;i++)t+=String.fromCharCode(s1.charCodeAt(i)-s.substr(s.length-1,1));
        document.write(unescape(t));
    }
    </script>
    I'm interested in knowing or understanding the values (e.g. s1, t). For example, when i=0, what values would s1.charCodeAt(i) and s.substr(s.length-1,1) hold? The reason I'm doing this is to understand how the CodeKey function really works. I don't see anything in the code above which tells it to decode on the basis of the CodeKey value. The only thing I can point to in the encoded text is the last character, which is set to 1, 2, 3 or 4 depending on the CodeKey selected during the encoding process. One can verify this using the link I have given above. To debug, I'm using the Firebug add-on with the script running on localhost on my WAMP server. I'm able to put a breakpoint on the JS using Firebug, but I'm unable to retrieve any of the user-defined parameters or functions I mentioned above. I want to know, in this context, what would be the best way to debug this encoded JS.

    Read the article

  • Sprite batching seems slow

    - by Dekowta
    I have implemented a sprite batching system in OpenGL which batches sprites based on their texture. However, when I'm rendering ~5000 sprites all using the same texture I'm getting roughly 30fps. The process is as follows:
    - Create the sprite batch, which also creates a VBO with a set size and creates the shaders.
    - Call begin, which initialises the render mode (at the moment just enabling alpha).
    - Call Draw with a sprite. This checks whether the sprite's texture has already been loaded; if so it just gets a pointer to the existing batch item and adds the new sprite coords. If not, it creates a new batch item, adds the sprite coords to that, and adds the batch item to the main batch. If the max sprite count is reached, render is called.
    - Call end, which calls render to render the leftover sprites in the batch and also resets the buffer offset.
    - Render loops through each item in the batch, binds the texture of the batch item, maps the data to the buffer and then draws the array. The buffer is then offset by the number of sprites drawn.
    I have a feeling it could be the method I'm using to store the batched sprites, or it could be something else I'm missing, but I still can't work it out. The cpp and h files are here: http://pastebin.com/ZAytErGB http://pastebin.com/iCB608tA On top of this I'm also getting a weird issue where, when two sprites are batched one after the other, the second sprite will use the same coordinates as the last; then when another one is drawn after it, it is fine. I can't seem to find what is causing this issue. Any help would be appreciated; I've been trying to work this out for a while now and can't seem to put my finger on what's causing it.

    Read the article

  • How should a non-IT manager secure the long-term maintenance and development of essential legacy software?

    - by user105977
    I've been hunting for a place to ask this question for quite a while; maybe this is the place, although I'm afraid it's not the kind of "question with an answer" this site would prefer. We are a small, very specialized, benefits administration firm with an extremely useful, robust collection of software, some written in COBOL but most in BASIC. Two full-time consultants have ably maintained and improved this system over more than 30 years. Needless to say they will soon retire. (One of them has been desperate to retire for several years but is loyal to a fault and so hangs on despite her husband's insistence that golf should take priority.) We started down the path of converting to a system developed by one of only three firms in the country that offer the type of software we use. We now feel that although this this firm is theoretically capable of completing the conversion process, they don't have the resources to do so timely, and we have come to believe that they will be unable to offer the kind of service we need to run our business. (There's nothing like being able to set one's own priorities and having the authority to allocate one's resources as one sees fit.) Hardware is not a problem--we are able to emulate very effectively on modern servers. If COBOL and BASIC were modern languages, we'd be willing to take the risk that we could find replacements for our current consultants going forward. It seems like there ought to be a business model for an IT support firm that concentrates on legacy platforms like this and provides the programming and software development talent to support a system like ours, removing from our backs the risks of finding the right programming talent and the job of convincing younger programmers that they can have a productive, rewarding career, in part in an old, non-sexy language like BASIC. Where do I find such firms?

    Read the article
