Search Results

Search found 8250 results on 330 pages for 'dunn less'.


  • How to view multiple log files as one file in unix/linux

    - by user42679
    Hi, I was wondering if there is a convenient way in Linux/Unix to read multiple log files as one. More specifically, I would like to view a sequence of log files (app.log, app.log.1, app.log.2, etc.) as one big file using normal Unix tools (vi, less, etc.), so that when EOF is reached the tool automatically moves to the beginning of the next file. In my work I have to analyze UAT/prod logs to investigate and solve problems, and having to traverse many log files separately interrupts the work and causes delays. Any ideas?
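
    One convenient approach, sketched below under the assumption that the rotated files follow the app.log.N naming above (ls -v is GNU coreutils):

        # Page through all rotations as one stream, oldest first
        # (ls -vr sorts app.log.N numerically in reverse, so app.log.2
        # comes before app.log.1).
        cat $(ls -vr app.log.*) app.log | less

        # Or open them all in less and jump between files with :n / :p.
        less app.log.2 app.log.1 app.log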

    Read the article

  • What can impact the throughput rate at the TCP or OS level?

    - by Jimm
    I am facing a problem where running the same application on different servers yields unexpected performance results. For example, running the application on a particular faster server (faster CPU, more memory), with no load, yields slower performance than running on a less powerful server on the same network. I suspect that either the OS or TCP is causing the slowness on the faster server. I cannot use iperf unless I modify it, because "performance" in my application is defined as: component A sends a message to component B, component B sends an ACK to component A, and only then does component A send the next message. So it is different from what iperf does, which to my knowledge simply tries to push as many messages as possible. Is there a tool that can look at the OS and TCP configuration and suggest the cause of the slowness?
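
    Since the send/ACK/send pattern is latency-bound rather than bandwidth-bound, per-message round-trip time is the number to compare, and Nagle's algorithm combined with delayed ACKs can stall exactly this kind of traffic, so testing with TCP_NODELAY set in the application is worth a try. A few hedged starting points for comparing the two hosts (the host name and interface below are placeholders):

        # Compare round-trip latency from each server to its peer.
        ping -c 100 component-b.example.com | tail -2

        # Compare the kernel's TCP tuning on both machines.
        sysctl net.ipv4.tcp_congestion_control net.ipv4.tcp_rmem net.ipv4.tcp_wmem

        # NIC offload and interrupt-coalescing settings can add latency
        # on some hardware; compare them too.
        ethtool -k eth0
        ethtool -c eth0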

    Read the article

  • Should I go along with my choice of web hosting company or keep searching?

    - by Devner
    Hi all, I have been searching for a good hosting company that offers all the services I need for my PHP and MySQL based website. It is a community site where users will be able to upload pictures, etc. The company I have in mind currently lets me do everything I need: use mail(), run cron jobs, and so on, for about $6/month. The only problem is that they limit accounts to 50,000 files at any time, which rather contradicts the "UNLIMITED SPACE" ad on their front page. Apart from that, I know of no reason not to go with them, but the 50,000-file limit is something I cannot live with once the users grow in number and their uploads exceed it. Since this is a dynamic website that also handles sensitive matters like payments, I am not sure whether I should start with this company and later switch to a better host that does not impose the file limit. If I switch later, I will need to back up all the files in my account (jpg, zip, etc.) and upload them to the new host, and I am not aware of any tools that can help with that process; can you mention any you know? I could go with the other companies right now, but they cost two or three times as much and offer fewer features unless I pay even more to accommodate my demands. Unfortunately, the company I am willing to go with now has no higher or better plan I could switch to, which is the really bad part. So my questions: (1) Since I am just starting out and the initial user base will be small, should I go with my current choice and switch to a better provider once demand increases? If yes, how do I transfer my database and files, especially the jpgs, to the new provider? I don't even know which tools are needed to back up and restore to another host. (2) (I don't like this idea, but still...) Should I pay double or triple right now for a better provider, without knowing whether the website will really do that well, just to save myself the trouble of backing up 50,000 files from the old host and uploading them to a new one? Backup and restore in such bulk is something I have never done, so I am stuck trying to decide, and the price per month is also a considerable factor. All these hosting companies say one thing in common: backups are the customer's responsibility and they are not liable for any loss. So whichever company I go with, they tell me to take backups via FTP so that I can restore them whenever I want (and it does seem safer to have the files locally with me). Some provide backup tools and some do not, and given their disclaimers I am not sure how far those tools can be trusted.
    I have never backed up and restored 50,000 files from one web host to another, so please, all you experienced people out there, leave your comments and suggestions so that I can decide. I have spent two days going back and forth on this and concluded that it is a double-edged sword; I cannot reach a satisfactory final decision without other people's suggestions. Surely someone out there has had to make a similarly troublesome decision. All your suggestions are appreciated. Thank you all.
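
    For the mechanics of the move itself, a minimal sketch, assuming both hosts allow SSH (on FTP-only shared hosts, an FTP mirroring tool such as lftp plays the same role); all names below are placeholders:

        # Dump the database on the old host.
        mysqldump -u dbuser -p sitedb > sitedb.sql

        # Copy the uploaded files and the dump to the new host; rsync
        # can resume if a transfer of 50,000 files is interrupted.
        rsync -avz --partial ~/public_html/uploads/ newhost.example.com:~/public_html/uploads/
        rsync -avz sitedb.sql newhost.example.com:~/

        # Restore the database on the new host.
        mysql -u dbuser -p sitedb < sitedb.sql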

    Read the article

  • What is the best IP/Subnet set up strategy for a multi-server webhosting setup?

    - by Roy Andre
    Sorry for the mixed-up title, but let me try to explain better: We run a hosting solution, which until now has supported shared hosting and VPSes. Easy enough. We are now getting larger clients who require a more complex setup. We have more or less settled on the server setup itself, which will consist of: 1-2 frontend proxy/load-balancing servers, 2+ application servers, 1 database server, and 1 optional memcached server. The issue we are dealing with is agreeing on a flexible and easy-to-maintain IP setup. So far we have considered VLANing the internal servers into their own subnet, assigning a public IP to each server, and so on. What would be the best approach here? Any best practices? Using one public IP on the frontend server and an internal subnet for the servers behind it? We could then just NAT in whatever needs to be reached directly from outside, for instance the DB server over port 3306.
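
    A hedged sketch of that last idea, assuming the front end holds public IP 203.0.113.10, the internal servers sit on 10.0.0.0/24, and the DB server is 10.0.0.20 (all addresses illustrative):

        # Forward MySQL traffic arriving on the public IP to the internal DB server.
        iptables -t nat -A PREROUTING -d 203.0.113.10 -p tcp --dport 3306 \
            -j DNAT --to-destination 10.0.0.20:3306

        # Rewrite outbound traffic from the internal subnet so replies
        # return through the front end.
        iptables -t nat -A POSTROUTING -s 10.0.0.0/24 -o eth0 -j MASQUERADE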

    Read the article

  • Server Backup Solutions - compiling?

    - by Webnet
    I've been researching backup solutions for a LAMP environment, to back up our databases and files alike. I'm looking for something open source with a UI (so I'm less likely to screw it up). I downloaded http://www.bacula.org/en/ and a few others, but they all talk about compiling first... that doesn't seem like something I should need to do. Is there a Linux package that handles backups that I don't know about? I should also specify that I'm looking to set up a backup server which backs up from several locations.
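
    For what it's worth, Bacula itself is packaged by the major distributions, so no compiling should be needed; exact package names vary slightly by release:

        # Debian/Ubuntu
        apt-get install bacula

        # RHEL/CentOS
        yum install bacula-director bacula-storage bacula-client

        # BackupPC is another packaged option with a web UI, designed to
        # pull backups from several machines onto one backup server.
        apt-get install backuppc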

    Read the article

  • HAProxy slow reload in DB mode

    - by com
    Recently I started using a great load-balancing tool, HAProxy. There is only one disturbing thing I cannot figure out how to deal with. We use HAProxy to balance MySQL traffic. When there is a lot of traffic and many connections, it takes ages for HAProxy to reload (~30 min); with less traffic it reloads within 1 min. I reload with: service haproxy reload. Of course, when I need to make an urgent configuration change I expect the reload to be very fast, but killing the HAProxy instances that are waiting for clients to disconnect drops the open MySQL connections. It looks like I made a mistake either in the HAProxy settings or in the application settings. If you know how to solve this, please help me. Thanks!
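
    For context: on most distributions, service haproxy reload wraps HAProxy's soft reload, which starts a new process and lets the old one linger until every established connection closes; long-lived MySQL connections keep it alive for the ~30 minutes described. A sketch of the two signal options (paths are the usual defaults, adjust as needed):

        # Soft reload (-sf): the old process keeps serving until existing
        # connections finish -- graceful, but slow with long-lived sessions.
        haproxy -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid \
            -sf $(cat /var/run/haproxy.pid)

        # Hard reload (-st): the old process is terminated immediately --
        # fast, but it drops the open MySQL connections.
        haproxy -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid \
            -st $(cat /var/run/haproxy.pid)

    Setting timeout client and timeout server in the HAProxy config so that idle sessions eventually expire is one middle ground between the two.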

    Read the article

  • Is it bad to have a very full hard drive on a high traffic database server?

    - by MikeN
    I'm running an Ubuntu server with MySQL as a high-traffic production database server. Nothing else runs on the machine except the MySQL instance. We store daily database backups on the DB server. Is there any performance hit, or any other reason, why we should keep the hard disk relatively empty? If the disk is filled to 86%+ with the database and all of the backups, does that hurt performance at all? That is, would the DB server perform any worse running at 86-90%+ of disk capacity than with only a 10% full disk? The total disk size on the server is over 1 TB, so even 10% of the disk should be enough for basic OS swapping and such.
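
    One way to answer this empirically rather than by rule of thumb, assuming the sysstat package is installed:

        # Watch whether the data disk is actually seek/IO-bound under load
        # (persistently high await and %util suggest the layout is hurting).
        iostat -x 5

        # Confirm how full each filesystem is and where the backups live.
        df -h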

    Read the article

  • What service forces the CPU to its lowest possible frequency on battery under Ubuntu?

    - by vava
    When I'm running on battery, even with the "performance" frequency-scaling governor, something regularly lowers the CPU speed to its lowest value. I don't really want that; my AC power strip is usually in another room, so I don't really need to save power. How can I find out which service is doing this? laptop_mode is disabled, so that's not it. Update: It looks like the CPU is scaled down only when it is under load. If it is more or less idle, it can stay at any frequency pretty much forever, but once it gets loaded it quickly drops to its lowest frequency. Another update: Something is setting the maximum frequency the CPU can have. Ubuntu launchpad bug 242006.
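
    The cpufreq sysfs interface shows whether a governor change or a lowered frequency cap is responsible; a sketch (the echo value is a placeholder in kHz, and writing it needs root):

        # Current governor and frequency for cpu0.
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq

        # If scaling_max_freq is below the hardware maximum, something has
        # capped it -- which matches the launchpad bug referenced above.
        cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq

        # Raise the cap back to the hardware maximum (value is illustrative).
        echo 2400000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq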

    Read the article

  • Is there any way to correct DNS spoofing against our domain?

    - by brandon
    This morning I found out that our domain and subdomains have been poisoned on the 4.2.2.1 and 4.2.2.2 DNS servers, along with others, I think, though I have not confirmed any others yet. Resolution through OpenDNS works correctly. I have updated our local DNS servers and cleared their caches, which has fixed things internally. The issue is that the domain is public-facing and customers are having problems. We are the authoritative DNS server for the domain, and everything under our control is correct. What I don't know is how to fix the name servers that are out of our control. Is there something we can do on our end? At the moment the only workaround I can think of is to ask customers to change their DNS to OpenDNS, which is not very practical. The other workaround would be to change our TLD, which is even less practical.
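
    A quick way to confirm the scope of the poisoning and how long it can persist, with yourdomain.example standing in for the real domain:

        # Compare the poisoned public resolvers against OpenDNS and against
        # your own authoritative server.
        dig @4.2.2.1 yourdomain.example A +short
        dig @208.67.222.222 yourdomain.example A +short     # OpenDNS
        dig @ns1.yourdomain.example yourdomain.example A +short

        # The record TTLs bound how long remote caches can keep serving a
        # bad answer once the upstream copy is corrected.
        dig yourdomain.example SOA +noall +answer

    Beyond contacting the operators of the affected resolvers, there is usually little to do from your end but wait out the TTL on the poisoned records.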

    Read the article

  • Remove a known network from Windows 8

    - by Edward Brey
    When Windows 8 detects a network based on the assigned IP address, netmask, default gateway, etc., it remembers the network along with the setting you gave it as a public or private network. If you change the configuration of a network (e.g. reconfigure your router), Windows may decide you are on a new network and assign it a name like Network 2 or YourAPN 2. This less-than-friendly name shows up in many places in the Windows 8 UI, but unlike the good old days of Windows 7, there doesn't appear to be any UI for merging or deleting these networks. What's the best way to merge or delete networks you don't want?
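
    In the absence of a built-in UI, one widely used approach is deleting the profile from the registry; the sketch below uses reg from an elevated prompt ({GUID} is a placeholder for the subkey whose ProfileName matches the network to forget), and registry edits deserve a backup first:

        rem List the remembered network profiles and their ProfileName values.
        reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\Profiles" /s

        rem Delete the unwanted profile's subkey.
        reg delete "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\Profiles\{GUID}"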

    Read the article

  • nginx static file buffer

    - by Philip
    I have an NFS share to which several frontend servers are connected, making the files stored on it available for HTTP download. It looks like I have problems with the way Apache is serving the files; there seems to be a very small buffer, or no buffer at all, which results in a lot of disk seeks. I did some testing with loading the whole requested file into memory at once and serving it to the client from memory, and with this technique I need fewer disk seeks per download stream. Since I don't want to implement this myself for production use, I thought I could maybe use nginx for it, because the documentation says it uses buffers for static file serving. Is it possible to increase the buffer size to a few MB, and if so, which config parameter do I have to change? Does anyone have experience with large buffers for static file serving? Is there a better way to reduce disk seeks?
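
    A hedged nginx sketch of the buffered approach, assuming the files are served from an NFS mount; output_buffers is the relevant directive when sendfile is off, and the sizes are illustrative, to be tuned against your seek metrics:

        location /downloads/ {
            root /mnt/nfs;
            # Read in large userspace chunks instead of many small
            # sendfile reads.
            sendfile off;
            output_buffers 1 2m;
        }

    nginx's directio and aio directives are the usual alternatives worth benchmarking for very large files.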

    Read the article

  • people_dl_import shows millions of records

    - by amit lohogaonkar
    We have a situation in production on our SharePoint 2007 based intranet platform: the crawl shows thousands of records under the people_dl_import category with the format spsimport://?$$dl$$/domain1/domain2/domain3/. The import also would not stop; it added millions of records to the database, and the disk was on the verge of filling up. On other servers, like dev, we have very little data in this category, and the format there is spsimport://domainname?$$dl$$?..., which is good: only about 6,000 rows, versus 2 million crawled under people_dl_import in production. I need to know the cause of this garbage data and how to fix it. I have tried resetting the content source, and this weekend I will run a full import to see whether the garbage data gets cleared. Any idea what causes this issue?

    Read the article

  • How to make a small image become really huge

    - by DennyHalim.com
    All webmasters already know about hotlinking, and we know how to ban bad referrers too... but I want revenge: I want to replace the hotlinked images with one huge image a few megs in size. I have found one good image, but it is less than 100 KB, and I already use it to replace all the bad hotlinkers' images. How can I convert this image so that it becomes a few megs?
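
    ImageMagick can do this in one step; a sketch, with file names and sizes as placeholders:

        # Resample the decoy to huge pixel dimensions at maximum JPEG
        # quality; the re-encoded file grows to several megabytes.
        convert decoy.jpg -resize 4000x4000 -quality 100 huge-decoy.jpg

        # Verify the result.
        ls -lh huge-decoy.jpg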

    Read the article

  • How do I join two worksheets in Excel as I would in SQL?

    - by Joel Coehoorn
    I have two worksheets in two different Excel files. They both contain a list of names and addresses. One is a master list that includes other fields; the other, pared down by another office, only includes name, address, and an id column. I want to use the second list to filter the first. I know how I could do this very easily with a database inner join, but I'm less clear on how to do it efficiently in Excel. How can I join two worksheets in Excel? Bonus points for showing how to do outer joins as well, and I would greatly prefer knowing how to do this without needing a macro.
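
    A macro-free sketch of the usual formula approach, assuming the id being matched is in column A of both lists and the master data sits in columns A:F of Sheet1 in Master.xlsx (the layout is hypothetical):

        Inner-join style lookup (returns #N/A on rows with no match,
        which is itself a way to filter):
            =VLOOKUP($A2, [Master.xlsx]Sheet1!$A:$F, 3, FALSE)

        Outer-join style (keeps unmatched rows, blank instead of #N/A):
            =IFERROR(VLOOKUP($A2, [Master.xlsx]Sheet1!$A:$F, 3, FALSE), "")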

    Read the article

  • Doesn't DNS diversity negatively affect performance? Why/how?

    - by cnst
    If you look at the press releases of the various orgs that run the Internet, you see them praise the fact that they now run root server X in city Y, as if that magically makes everyone in city Y get all the relevant resolutions from the local server X instead of going 200 ms across oceans and continents. Similarly, the zones of some geographical domains, like .ru, are mirrored not just within Europe but also, for example, in Hong Kong, which is about 300 ms away from central Europe, since the traffic often crosses two oceans each way. Doesn't all of this negatively affect DNS performance? Isn't a pool of geodispersed authoritative servers more of a liability, especially if your target audience is geographically concentrated? Perhaps a better question is: are there any DNS resolvers that use something better than naive round-robin for choosing which authoritative server to contact?
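
    On the last point: resolvers such as BIND track a smoothed round-trip time per authoritative server and prefer the fastest, so a distant mirror is mostly ignored once the resolver has learned its latency. The effect is easy to measure (server and zone names below are placeholders):

        # "Query time" in dig's output is the RTT a resolver would learn.
        dig @ns-eu.example.net example.net SOA | grep "Query time"
        dig @ns-hk.example.net example.net SOA | grep "Query time"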

    Read the article

  • CentOS resource footprint vs Ubuntu package freshness

    - by webworm
    I am trying to decide which distro to sink my teeth into. I am new to the Linux world and would like to choose one distro to focus on. I have read that CentOS uses fewer resources than Ubuntu, which matters to me since I am renting a VPS and resource cost is an issue. I have also read that Ubuntu has more up-to-date packages, which is a concern because I want to use PHP and some packages that have a fair number of dependencies. I am not using Linux as a desktop OS, just as a server for Apache, PHP, Perl, and Java development. Which would be the better choice for a server OS, CentOS or Ubuntu? Are the resource requirements really that different? Are the packages that different between the two? Thanks.

    Read the article

  • How to make a quiet laptop? [closed]

    - by psihodelia
    Most modern laptops have very noisy fans. I am looking for a quiet laptop, or a small stationary computer with all its hardware built into the display. Most tasks will be PDF/document processing, real-time audio processing, web surfing, and Skype video chats. Certainly, there is no fanless model today, but maybe some existing laptops do not switch their fans on so often, or implement different solutions? For example, an iPad has no fan at all and is fast enough for my needs, but it has no normal operating system, so I can't use it for anything but audio chats and web surfing. Or maybe I can buy a laptop and tweak it to make it absolutely noiseless? Can you recommend a solution, please? Thanks in advance!

    Read the article

  • Trouble logging into a Mac share from a Windows PC on the network

    - by villares
    I have a mixed network and usually log into the Macs from the Windows XP Home machines and vice versa. I have no real networking knowledge; things just seem to work, more or less, with the default settings. Now I've got a new Snow Leopard Mac with a shared folder (I added the user names of the Windows users in the Sharing preferences), and the trouble is that some machines can open the share and others can't. I can't see the difference. It feels as if some Windows machines have a "cache" and won't ask for the share password, just deny access. I can also see old shares offered in the Windows "Add Network Place" wizard.
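
    On the XP side, that "cache" is usually a lingering SMB session or a stored password; a hedged sketch of clearing both on an affected machine:

        rem List current SMB sessions, then drop them all so the Mac share
        rem prompts for credentials again.
        net use
        net use * /delete

        rem Stored passwords can be removed from the credential manager.
        control keymgr.dll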

    Read the article

  • How can HAProxy improve availability, or "how can I prevent my site from going down"? [closed]

    - by Joe Hopfgartner
    I am aware of what HAProxy does, but what if my HAProxy goes down? Or what if my DNS servers go down? Granted, DNS is less of a problem. But DNS only resolves to an IP, and that IP is announced via BGP to be routed through some router: what if that router goes down? Of course, if I had complicated application servers that were likely to fail, HAProxy could significantly improve uptime. But my application isn't like that; in fact, it may very well just be delivering a small static HTML file via HTTP. Basically, if any user anywhere types in MYDOMAIN.COM, I want that user to get something on the screen other than a timeout or a DNS resolution error. How can I do that? The point of entry is the difficult part, the so-called "initial closure mechanism".
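
    The conventional first layer here is DNS-level redundancy: publish several A records pointing at HAProxy front ends in separate facilities, so a client that cannot reach one address tries another. A BIND-style zone fragment (addresses illustrative):

        ; Two front ends in different locations; resolvers rotate between
        ; them, and most clients retry the other address if one is down.
        www     IN  A   203.0.113.10
        www     IN  A   198.51.100.10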

    Read the article

  • Is it possible to turn a shortcut to a folder into a menu of items in that folder?

    - by MrVimes
    I vaguely remember seeing something on the internet showing that it is possible to turn any shortcut to a folder into a menu, as long as the shortcut is in the Start menu or on the Quick Launch bar (i.e. somewhere that allows menu functionality). Does anyone know if this is possible, and if so, how to do it? I'd like to be able to do this with a link in my Quick Launch area. I remember it had something to do with renaming the shortcut with a long string of characters placed between '{' and '}'. I realize how picky this request is, as I have more or less achieved what I am looking for by placing the 'Desktop' toolbar on my taskbar, but I'd rather have an icon in my Quick Launch. Just humour me :)

    Read the article

  • Win XP Pro, IIS 5.1, PCI Compliance

    - by Mudman266
    I have a client that was scanned and determined not to be PCI compliant. I looked, and they had IIS set up to allow a program from the central office to push/pull info from their server. Many of the reasons they failed appeared to have been fixed in service packs (they were on SP2) or security updates. I fully patched the server (Windows XP Pro) to SP3 with all optional updates. I had them scan again, and again they failed, with only one less vulnerability, which I corrected manually (the server was showing debugging/error messages). The main issue I'm having is that when I research the CVE code for each error, the references say they were fixed in SP2 and up. I'm wondering if I need to remove IIS and set it up again now that I have patched to SP3. Any ideas?

    Read the article

  • Remote Desktop client versus web-based access for reports and limited data entry

    - by Voyager
    We have a requirement from management to give distributors/dealers limited access to our application, so they can look at their account statements in our books of account and enter their purchase requirements (sales orders for us). We have given a few of them RDC access; they connect to our terminal server and access the reports. This requires a TS client license for each distributor. Would it be better, more secure, and less costly to build a web-based application just for entering the orders and retrieving reports like pending orders, ledgers, receivables, etc.? Also, which is more secure as far as database access is concerned: browser-based access or RDC access? Please answer.

    Read the article

  • Access denied errors in Windows Server 2008 for system administrator

    - by NLV
    Hello, I have an HP Pavilion DV4 with Windows Server 2008 R2 Enterprise. The system is joined to a domain, and I log in using my domain account. I am the administrator for the local machine, but I'm getting access denied everywhere (file system, running commands, using Visual Studio... literally everywhere) on my machine. What could be the problem? Any ideas? Please tell me if you need more information, as I'm just a developer with little knowledge of system administration.
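
    One thing worth ruling out, since this pattern matches UAC token filtering on Server 2008 R2 (an administrator's normal processes run with a filtered token unless elevated):

        rem In the session getting "access denied", check how the
        rem Administrators group appears in the current token; with UAC
        rem filtering it shows as "Group used for deny only".
        whoami /groups | findstr /i "administrators"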

    Read the article

  • Recommendation for configuring a multi-core guest OS

    - by reidLinden
    Hi there, I've just received an upgraded host machine and am looking to push some of those advances to my workstation's guest OS(es). In particular, I used to have a single processor with 2 cores, so my guest OS had 1 processor/1 core. Now I've got a single processor with 8 cores, so I'm curious what would be recommended for my guest OS now: 1 processor/4 cores? 2 processors/2 cores? 4 processors/1 core? My instinct says to stick with the number of physical processors (or fewer), but is that based in reality? I spent a good while looking for an answer to this, but perhaps my google-karma isn't in my favor today. Suggestions?

    Read the article

  • Process of carrying out a BER test

    - by data
    I subscribe to an ISP supplying a 3 Mb ADSL line. For the last 4 weeks, speeds have dropped from the usual average downstream speed of ~250 kbps to just 0.14 Mbps (according to speedtest.net), and employees are complaining about lack of access to the server. I have been calling customer support and logging calls for the last 3 weeks, but they have been unable to determine the source of the problem beyond carrying out a few bitstream tests and checking the DHCP renewal times. I am going to call back and suggest carrying out a BER (bit error rate) test. What type of equipment is needed to carry out this test? I have access to a wide range of Cisco networking equipment. Other: we don't need a leased line, as there are fewer than ten employees.

    Read the article
