Search Results

Search found 8219 results on 329 pages for 'less'.

  • BlackBerry Won't Sync: Says "Processing"

    - by Noah
    I have a BlackBerry Storm 2 and I have it set up to sync, just like all the other ones in the company: they pull the address book/contacts from Outlook and aren't synced to an Exchange server. I have everything set up, but when I hit synchronize it flashes "processing", the phone shows it trying to sync, and then it's done in less than a second. It did warn me that the computer was not supplying enough power to charge the device, and said I should make sure the drivers are correct. I was also warned, down in the bottom right-hand corner, that the device might not function properly. Any ideas? I've already uninstalled and reinstalled the BlackBerry software from the disc.

  • Is Protune for video only, or can it be used for photos too?

    - by Green
    I have a Hero3+ Black Edition. I can't work out whether Protune is for video only or can be applied to photos as well. The manual (page 35) says it is for both video and photo: "High-Quality Image Capture: Protune's high data rate captures images with less compression, giving content creators higher quality for professional productions. Film/TV Rate Standard: While shooting in Protune, you have the option of recording video in cinema quality 24 fps to easily intercut GoPro content with other source media without the need to perform fps conversion." But at the same time their site says that Protune is for video only: "To record Protune footage, you'll need to turn Protune ON in your camera's settings menu." So what is Protune for: photo, video, or both?

  • Internet speed and routers: who controls them?

    - by Ozgun Sunal
    I need to learn two things; each is somewhat related to the other. First: while LAN speeds are usually 100 Mbps or even gigabit levels (very high compared to WAN speeds), WAN connections such as DSL are far slower, and yet we are able to download huge files over them. Isn't this odd? My real concern is why WAN speeds are lower than LAN speeds. Second: who controls the routers around the larger Internet? While we, as web clients, are connected to the Internet, packets travel through those routers to the destination network(s). Are those routers all inside the ISP's network, and if not, who controls that large number of routers?
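
    A concrete way to see who operates the routers between you and a destination is traceroute: each hop's hostname usually names the ISP or transit carrier that runs it. A minimal example (the destination is arbitrary):

        traceroute example.com     # Linux / OS X
        tracert example.com        # the Windows equivalent

    The first few hops sit inside your own ISP's network; after that, the packets cross routers belonging to one or more transit providers, each operated by the company whose name appears in the hop's hostname, until they reach the destination's network.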

  • High load on Ubuntu 8.04 LTS

    - by Paddington
    My Ubuntu 8.04 LTS server periodically has a high load average spike (once every 2 days), resulting in Apache timing out; virtually everything, even SSH to the server, becomes impossible. When I am on the console and run top, I see the load average increase from less than 1 to above 60 in 15 minutes. How can I isolate the cause? A top snapshot:

        top - 09:21:51 up 37 days, 20:18, 6 users, load average: 5.41, 5.53, 5.36
        Tasks: 160 total, 2 running, 156 sleeping, 0 stopped, 2 zombie
        Cpu(s): 65.0%us, 8.8%sy, 0.0%ni, 1.0%id, 24.6%wa, 0.3%hi, 0.3%si, 0.0%st
        Mem: 3989468k total, 3444984k used, 544484k free, 360460k buffers
        Swap: 11687248k total, 178168k used, 11509080k free, 881772k cached
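
    The 24.6%wa (I/O wait) in that snapshot suggests the box is stalling on disk rather than CPU, so that is worth ruling out first. A sketch of the usual elimination steps (iostat comes from the sysstat package; the log path is just an example):

        # watch memory, swap, and disk I/O while a spike is happening
        vmstat 5
        sudo apt-get install sysstat && iostat -x 5

        # crontab entry: snapshot the top processes every minute, so the
        # culprit is on record the next time the load climbs to 60
        * * * * * top -b -n 1 | head -n 25 >> /var/log/load-snapshots.log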

  • Remote Desktop Client versus web-based access to reports and limited data entry

    - by Voyager
    We have a requirement from management to give our distributors/dealers limited access to our application, so they can look at their account statements in our books of account and enter their purchase requirements (sales orders for us). We have given a few of them RDC access; they connect to our terminal server and view the reports, which requires a TS client license for each distributor. Would it be better, more secure, and less costly to build a web-based application just for entering orders and retrieving reports (pending orders, ledgers, receivables, etc.)? And which is more secure as far as database access is concerned: browser-based access or RDC access?

  • Browsing is much slower on one PC wired to the same router - why?

    - by deanalt
    My wife is not happy. On her computer it takes about 5 seconds to open a Google window, versus about 1 second on the faster computer, which is itself about 3 years old. Yes, hers is an older computer (5-6 years old, I'd guess), surely with less RAM, but should that matter for simple browsing? Both are hardwired to the same Netgear RangeMax router, both use fixed IP addresses, both run XP, and both have about 8 feet of cable to the router. I have the fastest service my cable company provides. Probably irrelevant, but two newer Macs are connected wirelessly during the summer and they are even faster, though I think that's down to the difference in browsers. If you could point me to a list of process-of-elimination steps, that would be most appreciated. Thanks, Dean
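
    A small batch sketch of elimination steps to run at a command prompt on both XP machines and compare side by side (8.8.8.8 is just a well-known external address; any reliable IP will do):

        ping -n 10 8.8.8.8           :: raw latency, no DNS involved
        ping -n 10 www.google.com    :: the same test, plus a name lookup
        nslookup www.google.com      :: is name resolution itself slow?
        tracert www.google.com       :: is a slow hop inside or outside the LAN?

    If the slow machine only lags on the tests that involve names, the difference is likely its DNS settings rather than RAM or browser age.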

  • Should I go along with my choice of web hosting company or still search?

    - by Devner
    Hi all, I have been searching for a good hosting company for my PHP & MySQL community website, where users will be able to upload pictures and so on. The company I have in mind currently lets me do everything: use mail(), run CRON jobs, etc., for about $6/month. The only problem is that they cap each account at 50,000 files at any time, which rather contradicts the "UNLIMITED SPACE" ad on their front page. Apart from that, I know of no reason not to go with them, but a 50,000-file limit is something I cannot live with once the users, and the files they upload, grow past it. Since this is a dynamic website that also handles sensitive matters like payments, I am not sure whether to start with this company and later switch over to a host that doesn't impose the limit. If I do switch, I will need to back up all the files in my account (jpg, zip, etc.) and upload them to the new host, and I am not aware of any tools that can help in this process; can you mention some? I could go with the other companies right now, but they cost double or triple the price and all offer fewer features than my current choice; they will accommodate my higher demands only if I pay more. Unfortunately, the company I am willing to go with has NO higher or better plan that I could switch to later, which is the really, really bad part. So my questions: (1) Since I am starting out and the initial user base will be small, should I go with the current choice and switch to a better provider once demand increases? If yes, how can I transfer my database and especially the jpg files to the new provider? I don't even know which tools are needed to back up and restore to another host. (2) Or (I don't like this idea, but still) should I pay more right now and go with a better provider, without knowing whether the website will do that well, just to save myself the trouble of moving 50,000 files from an old host to a new one? Backup and restore in such bulk is something I have never done before, and the price per month is also a considerable factor. All these hosting companies say one thing in common: backup and restore are the customer's responsibility, and they are not liable for any loss. Whichever company I go with, they advise taking backups via FTP so I can restore whenever I want (and it seems safer to have the files locally with me anyway). Some provide backup tools and some don't, and given their disclaimers I am not sure how far those tools can be trusted.
    I have never backed up and restored 50,000 files from one web host to another, so please, all you experienced people out there, leave your comments and suggestions so that I can decide. I have spent 2 days going back and forth on this and concluded that it is a double-edged sword; I can't reach a satisfactory decision without others' input. Surely someone out there has faced a similarly troublesome decision. All suggestions are appreciated. Thank you all.
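
    For what it's worth, the move itself usually comes down to two standard tools; a minimal sketch, assuming SSH access on both hosts (user names, host names, and paths are made up):

        # the database: dump on the old host, load into the new one
        mysqldump -u dbuser -p communitydb > communitydb.sql
        mysql -u dbuser -p communitydb < communitydb.sql

        # the files: pull everything down locally, then push it back up;
        # rsync resumes cleanly, which matters with 50,000 small files
        rsync -avz olduser@old-host.example.com:public_html/ ./site-backup/
        rsync -avz ./site-backup/ newuser@new-host.example.com:public_html/

    If a host only offers FTP, lftp's mirror command does the same pull-then-push job over FTP.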

  • Split a text file after every n matches into new files using sed or awk

    - by ozz
    I am trying to split a file into parts of n matches each. The file is just one line, and the separator is '<br>': foo<br>bar<br>.....<br> I want to split the file into parts where each file holds 100 records (text plus <br>); normally 100 records, but the last file may hold fewer. I already played around with split-file-in-2-with-sed and split-one-file-into-multiple-files-based-on-pattern: sed.exe -e "^.*.<br>{0,100}/g" < original.txt > first_half.txt The split does not work, and the result is only 1 file instead of many.
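
    One way around this is to stop fighting the single giant line: put a newline after every <br> first, then let split do the counting. A sketch (GNU sed and coreutils assumed; output files are named part_aa, part_ab, and so on):

        # one record per line, then 100 lines (records) per output file
        sed 's/<br>/<br>\n/g' original.txt | split -l 100 - part_

        # the same idea in gawk, if your sed can't emit \n in a replacement
        awk 'BEGIN { RS = "<br>" } NF { printf "%s<br>\n", $0 }' original.txt | split -l 100 - part_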

  • How can HAProxy improve availability, or "how can I prevent my site from going down"? [closed]

    - by Joe Hopfgartner
    I am aware of what HAProxy does, but what if my HAProxy goes down? Or what if my DNS servers go down? Granted, DNS is less of a problem, but DNS only resolves to an IP, and that IP is announced via BGP to be routed through some router; what if that router goes down? Of course, if I had complicated application servers that were likely to fail, HAProxy could significantly improve uptime. But my application isn't like that; in fact, it may well just be delivering a small static HTML file via HTTP. Basically, if any user anywhere types in MYDOMAIN.COM, I want them to get SOMETHING on the screen other than a timeout or a DNS resolution error. How can I do that? The point of entry is the difficult part, the so-called "initial closure mechanism".
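
    For the narrower "my HAProxy box dies" case, the stock answer is a second HAProxy machine plus keepalived floating a shared IP between the two via VRRP; whichever box is alive holds the IP. A minimal sketch of the master's side (the interface name and the 203.0.113.10 address are placeholders):

        cat > /etc/keepalived/keepalived.conf <<'EOF'
        vrrp_instance VI_1 {
            state MASTER            # the standby box says BACKUP, lower priority
            interface eth0
            virtual_router_id 51
            priority 100
            virtual_ipaddress {
                203.0.113.10        # the address your DNS record points at
            }
        }
        EOF

    That only covers a dead balancer within one facility; surviving a whole data center or its upstream router means DNS failover with a short TTL, or BGP anycast if you control address space, which is a different class of problem.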

  • How to make a quiet laptop? [closed]

    - by psihodelia
    Most modern laptops have very noisy fans. I am looking for a quiet laptop, or a small stationary computer with all its hardware built into the display. Most tasks will be PDF/document processing, real-time audio processing, web surfing, and Skype video chats. Certainly there seems to be no fanless model today, but maybe some existing laptops don't switch their fans on so often, or implement different solutions? For example, an iPad has no fan at all and is fast enough for my needs, but it has no normal operating system, so I can't use it for anything but audio chats and web surfing. Or maybe I can buy a laptop and tweak it to make it absolutely noiseless? Can you recommend a solution? Thanks in advance!

  • How can I figure out which PHP extensions aren't being used?

    - by Tom Marthenal
    I manage a server (running Ubuntu) which hosts our clients' sites: a few dozen different PHP-based websites, mostly small, but also some installations of CMSes and forums. I used get_loaded_extensions() to see which extensions I have loaded. To help streamline the server (removing unnecessary extensions makes upgrading easier and marginally improves speed), I'd like to remove the extensions that aren't being used by any of the sites. I currently have 54 different extensions loaded. I can easily keep the ones on the list that I know are used, but about others I am less sure. Is there some way to see which extensions have not been used recently?
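
    There is no built-in usage log, but one crude heuristic is to ask PHP which functions each extension provides and grep the sites' code for them. A sketch (the /var/www path is an assumption, and this misses dynamic calls and purely class-based extensions, so treat "no hits" with suspicion):

        ext=gd   # repeat for each name reported by get_loaded_extensions()
        for fn in $(php -r '$f = get_extension_funcs($argv[1]); if ($f) echo implode("\n", $f);' -- "$ext"); do
            if grep -rqw "$fn" /var/www; then
                echo "$ext looks used (found $fn)"; break
            fi
        done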

  • Configuring MPI on 2 nodes

    - by Wysek
    I'm trying to create a really simple "cluster" from 2 multicore computers using Open MPI. My problem is that I can't find any tutorials on the matter. I don't want to use Torque, because it isn't necessary in my case, yet all the tutorials give configuration details for either Torque or mpd (which doesn't exist in the Open MPI implementation). Could you give me some tips or links to appropriate manuals? Steps I've already completed: Open MPI installation; network configuration (the computers see each other); password-less SSH login to the second computer. I tried using a machinefile without further configuration, with just the 2 IPs in it, but jobs don't seem to start at all after the initialization part. (MPI itself seems to work, because I'm able to scatter jobs onto multiple cores of both computers as long as they don't communicate with each other.)
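
    For plain Open MPI, no Torque or mpd is needed; a hostfile plus mpirun is the whole scheduler. A minimal sketch (the IPs and slot counts are assumptions; slots should match each machine's core count):

        cat > hostfile <<'EOF'
        192.168.1.10 slots=4
        192.168.1.11 slots=4
        EOF

        # launch 8 ranks spread across both machines over password-less SSH
        mpirun --hostfile hostfile -np 8 ./my_mpi_program

    If the ranks hang right after initialization, the usual suspect is a firewall blocking the dynamic TCP ports Open MPI opens between the nodes.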

  • What service is holding the CPU at its lowest possible frequency on battery under Ubuntu?

    - by vava
    When I'm running on battery, even with the "performance" frequency-scaling governor, something regularly lowers the CPU speed to its lowest value. I don't really want that; my AC power strip is usually in another room, so I don't really need to save power. How can I find which service is doing that? laptop_mode is disabled, so that's not it. Update: it looks like the CPU is scaled down only when it is under load. If it is more or less idle, it can stay at any frequency pretty much forever, but once it gets loaded, it quickly drops to its lowest frequency. Another update: something is setting a maximum frequency that the CPU can have. Ubuntu launchpad bug 242006
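
    The "something sets a maximum frequency" theory can be confirmed straight from sysfs: if scaling_max_freq drops below cpuinfo_max_freq while the machine is loaded, something has written a cap. A sketch:

        # the hardware's top speed vs. the currently permitted top speed
        cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq

        # the active governor and the frequency in use right now
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq

    Load-triggered capping also matches thermal throttling, so the ACPI thermal zones (/proc/acpi/thermal_zone on kernels of that era) are worth a look while it happens.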

  • Can I make Firefox ignore/interpret font sizes specified in pixels?

    - by Andy
    Hi all, I have an 11.1" notebook display with a 1366x768 resolution, which gives it a DPI of 141. I'm running GNOME and have configured the DPI. Everything works fine except web browsing: far too many websites specify their font sizes in pixels, which ends up as very small text on a high-DPI display. My ideal solution would be for Firefox to interpret an absolute pixel size in terms of normal DPI and display it appropriately for mine (e.g. scale it by 141/96). Obviously this would cause problems on occasions where graphics have been pixel-aligned with fonts in some way, but I imagine that would cause me far less of a headache than either reading minute text or scaling the text manually each time. Any suggestions? TIA, Andy
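
    Firefox won't reinterpret px units as such, but two about:config preferences get close to the effect; a sketch of a user.js for the profile directory (replace XXXXXXXX with your profile folder, and treat the value 14 as a starting point):

        cat >> ~/.mozilla/firefox/XXXXXXXX.default/user.js <<'EOF'
        // never render text smaller than this, whatever the CSS px size says
        user_pref("font.minimum-size.x-western", 14);
        // remember zoom per site, so manual scaling at least sticks
        user_pref("browser.zoom.siteSpecific", true);
        EOF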

  • Disable Control-L hotkey on YouTube

    - by Heptite
    I am used to using Ctrl+L to access my address bar. Unfortunately, YouTube "helpfully" defines Ctrl+L as a hotkey to jump forward several seconds in the currently playing video, so I need a way to stop YouTube from adding that key binding. A Greasemonkey/Scriptish userscript that disables the binding after page load would be acceptable. To be clear, this happens while I am on a YouTube video's page, not on a third-party site that has embedded a YouTube video, and it occurs when the Flash player does not have focus. Note: I am aware that Alt+D does the same as Ctrl+L in Firefox, but I'm too used to Ctrl+L and I'd rather not be forced to switch. Edit: OS-specific solutions are less desirable, since I use multiple OSes; in-browser solutions are preferred.
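
    A userscript along these lines might do it: catch Ctrl+L in the capture phase, before YouTube's own page-level handler sees it, and stop it from propagating. This is a sketch, untested against YouTube's actual handlers, and it only helps while the page rather than the Flash plugin receives the keystroke:

        // ==UserScript==
        // @name     Keep Ctrl+L for the address bar
        // @include  http://www.youtube.com/watch*
        // @include  https://www.youtube.com/watch*
        // ==/UserScript==
        document.addEventListener('keydown', function (e) {
            // 76 == 'L'; swallow the event before YouTube's listener runs,
            // while leaving the browser's own Ctrl+L default intact
            if (e.ctrlKey && e.keyCode === 76) {
                e.stopPropagation();
            }
        }, true);  // true = capture phase, so this handler goes first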

  • Getting back the old Alt-Tab window-switching behavior in Windows 7?

    - by Carlos A. Ibarra
    When you run more than 6 applications on Windows 7 and press Alt+Tab, icons representing the first 6 applications and the desktop appear on the first row of the grid, and you can cycle with Alt+Tab+Tab... through the 6 most recently used windows the usual way; but the 7th and other less recently used windows don't follow the same rules. Instead, they get grouped by application, regardless of how recently they were used. This new behavior is mentioned here. I am very used to the old way of cycling, and the new system is driving me crazy. I tend to have 20 or so windows open at a time, and I frequently need to Alt+Tab to the 7th or 8th window on the stack, but it doesn't work the same anymore. Does anyone know how to bring back the old behavior, so that Alt+Tab+Tab+Tab... goes through the whole list in most-recent to least-recent order?
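
    One thing worth trying: Windows 7 still ships the classic XP-style switcher, which orders windows strictly by most recent use, and a registry value switches it back on (log off and on again afterwards; as far as I know this changes the visuals and the ordering together):

        reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer" /v AltTabSettings /t REG_DWORD /d 1 /f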

  • How to achieve redundancy across data centers?

    - by BrandonBT
    I have a LAMP server with a lot of hardware redundancy built in, so I am not worried about the server itself becoming unavailable. What I am worried about are potential network issues in the data center the server is in. What I would like is another server in another data center for redundancy; load balancing is less of a concern. With that said, I am relatively clueless on two points: (1) how to keep two servers in geographically separate data centers holding exactly the same data, in terms of both files and MySQL databases; and (2) how to ensure that all traffic coming into one data center is automatically redirected to the other data center in the case of a network or server failure at the first. Any guidance on these two problems would be greatly appreciated.
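
    For the data-sync half, the stock combination is MySQL replication for the databases and rsync for the files. A sketch of the primary's side (names, paths, and the schedule are invented):

        # /etc/mysql/my.cnf additions on the primary: replication prerequisites
        cat >> /etc/mysql/my.cnf <<'EOF'
        [mysqld]
        server-id = 1
        log-bin   = mysql-bin
        EOF

        # crontab entry: push uploaded files across every 10 minutes
        */10 * * * * rsync -az /var/www/ standby.example.com:/var/www/

    The standby needs its own server-id and a CHANGE MASTER TO ... statement pointing at the primary (the MySQL manual's replication chapter walks through it). The failover half is usually DNS-based: a low-TTL record, or a managed DNS failover service that health-checks the primary.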

  • How can Automator or AppleScript rotate 500,000 images in folders and subfolders?

    - by Ludi
    I have a very specific question. I've got 500,000 images that sit in 98 subfolders; some of the subfolders contain 25,000 images. Is there any script/Automator workflow I can use? I tried creating a workflow: (1) Ask for Finder Items; (2) Get Folder Contents (with "repeat for each subfolder found" ticked); (3) Rotate Images. It works OK, but only for folders holding fewer than 4,096 files; otherwise I get the following error message: "Rotate images failed - 1 error too many arguments (12019) -- limit is 4096". Is there any way to increase this limit, or a completely different AppleScript approach? I really hope someone can help me with this one.
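
    The 4,096 cap is an argument-list limit, and the usual way around it is to skip Automator and stream the file list through find, rotating one image at a time with sips (which ships with OS X). A sketch, assuming JPEGs and a 90-degree clockwise rotation; note that sips rewrites files in place, so test on a copy first:

        # rotate every .jpg under the top folder, one file at a time;
        # find recurses through all 98 subfolders, so no per-folder limit applies
        find "/path/to/images" -type f -iname '*.jpg' -exec sips -r 90 {} \;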

  • Server backup solutions: is compiling really required?

    - by Webnet
    I've been researching backup solutions for a LAMP environment, to back up our databases and files alike. I'm looking for something open source with a UI (so I'm less likely to screw it up). I downloaded http://www.bacula.org/en/ and a few others, but they all talk about compiling first... this doesn't seem like something I should need to do... is there a Linux package that handles backups that I don't know about? I should also specify that I'm looking to set up a backup server which backs up from several locations.
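
    For what it's worth, Bacula itself is already packaged on Debian/Ubuntu, so no compiling is needed there; a quick check:

        apt-cache search bacula      # list the prebuilt Bacula packages
        sudo apt-get install bacula  # metapackage pulling in the server and client daemons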

  • RocketRAID gives me a stuttering computer

    - by Dan
    I have a SANS DIGITAL TowerRAID TR8M-BP (with the RocketRAID 622 controller) and 5 x Samsung Spinpoint F4 HD204UI 2TB 5400 RPM SATA 3.0Gb/s hard drives in a RAID 5. It connects via 2 eSATA cables. My drivers are up to date. When I'm writing data to the drives, my computer stutters (meaning it freezes for half a second every 2 seconds or so): my screen freezes, my music goes into a loop; it's really annoying. I get the same thing in Windows 7 as I do in Linux. The only difference is that it seems to happen less in Linux, but occasionally Linux will crash (I've never had Linux crash for any other reason, so I'm assuming they have a poor Linux driver and kernel module). Any tips for how to deal with this? Thanks

  • Trouble logging into a Mac share from a Windows PC on the network

    - by villares
    I have this mixed network and usually log into the Macs from the Windows XP Home machines and vice versa. I have no real networking knowledge; things just seem to work, more or less, with the default settings. Now I've got a new Snow Leopard Mac with a shared folder (I added the user names of the Windows users in the Sharing preferences), and the trouble is that some machines can open the share and others can't. I can't see the difference. It feels like some Windows machines have a "cache" and won't ask for the share password, just deny access. I can also see old shares offered in the Windows "Add Network Place" wizard.
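
    The "cache" hunch is easy to test: Windows keeps SMB sessions and stored credentials around, and clearing them forces a fresh password prompt. A sketch to run at a command prompt on one of the failing XP machines (the Mac's name and the user name are placeholders):

        net use                      :: list the current SMB connections
        net use * /delete            :: drop them all, including stale ones
        net use \\snowleopard-mac\shared /user:macusername *
        :: the * forces a password prompt; saved passwords can also be
        :: reviewed on XP via: control keymgr.dll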

  • What methods are there to configure puppet to serve resources for multiple environments?

    - by cclark
    I seem to come across two ways of using Puppet in multiple environments: (1) install a puppetmaster in each environment, and update that environment's recipes from source control only when they are ready to be deployed there; or (2) use one puppetmaster, set a variable in each client's puppet.conf to specify its environment, and on the puppetmaster specify a different modulepath for each environment, with each path tracking the branch of the recipe repository intended for that environment (e.g. dev, staging, production). Running only one puppetmaster seems like one less piece of infrastructure to keep running, but there is some additional complexity in the configuration. Are there other pros or cons to either of these methods, or something I'm missing entirely?
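
    For reference, option 2 is only a few lines on each side on the Puppet versions of this era (config-file environments). A sketch, with paths and environment names as examples:

        # on the puppetmaster: one module tree per environment
        cat >> /etc/puppet/puppet.conf <<'EOF'
        [development]
        modulepath = /etc/puppet/environments/development/modules
        [production]
        modulepath = /etc/puppet/environments/production/modules
        EOF

        # on each client: declare which environment it belongs to
        cat >> /etc/puppet/puppet.conf <<'EOF'
        [agent]
        environment = development
        EOF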

  • Using socat to exec the PHP CLI

    - by RoyHB
    There are multiple client programs that periodically connect to a port on my server and send a single line of text. When a connection to the port is made, I need to start a PHP CLI script that processes the data. There may be many of the remote clients running/connecting at more or less the same time, so I think it would be best if socat forked a process for each connection to run the script. I've gotten socat to do most of what I need, using the command socat tcp-l:myport,fork exec:mypath/socatTest.php and I can read the input on php://stdin. All is good. The problem is that the process doesn't seem to fork: if a second external program sends data while another is doing the same, it gets a connection refused error. Where have I gone wrong?
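
    The fork option itself looks right, so the usual suspects are the listener being restarted between connections and the listen backlog filling up. A sketch with the relevant options spelled out (the port and script path are from the question):

        # reuseaddr lets socat rebind immediately after a restart;
        # backlog queues simultaneous connect attempts instead of refusing them
        socat TCP-LISTEN:myport,fork,reuseaddr,backlog=50 EXEC:"php mypath/socatTest.php"

    It is also worth confirming that each script invocation exits promptly; with fork, one lingering child per connection piles up quickly.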

  • Determining the State of a User from Their Hostname

    - by PhpMyCoder
    Not sure if this is the right SE site; I figured this question doesn't belong on SO, but if you think it doesn't belong here either, I apologize. I've been looking into determining the location, specifically the state, of users accessing my website. One option I've known about for a while is the GeoIP City database, but it isn't the most cost-effective solution, and I'm cheap, so I was looking for a less expensive way. Something that occurred to me is that my state appears in the public hostname assigned to me by Comcast: (dash-separated IP).hsd1.ma.comcast.net. Could it be that other ISPs follow this same pattern of inserting the state abbreviation into their users' hostnames? I've been looking around for a list of hostname formats for other ISPs, but I haven't found anything. Can anyone verify whether this holds true for other major ISPs?
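
    The lookup itself is just reverse DNS on the client's IP; a sketch (the address is an example, and many ISPs return no state token at all, so this can only ever be a best-effort hint):

        # reverse-resolve a client address and eyeball the hostname
        dig +short -x 198.51.100.23
        # hypothetical output: c-198-51-100-23.hsd1.ma.comcast.net.
        # the 'ma' label would be the state token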

  • How can I tell how many bits my ssh key is?

    - by yairchu
    I created an SSH key for myself sometime in the past, and I don't remember "how many bits" it is. How can I tell? I'm wondering because I'm using hosting at nearlyfreespeech.net, and their FAQ says: "Can I configure my ssh connection to use a public key? ... we will not install keys that have a length less than 1536 bits ... We prefer that you use a key at least 2048 bits in length, and if you are generating a new key, the recommended length is 4096 bits."
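
    ssh-keygen reports this directly; the first field of its fingerprint output is the key length in bits (the path below assumes a default RSA key):

        ssh-keygen -l -f ~/.ssh/id_rsa.pub
        # prints e.g.: 2048 ab:cd:...:ef /home/you/.ssh/id_rsa.pub (RSA)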
