Search Results

Search found 5290 results on 212 pages for 'refresh rate'.

Page 109 of 212

  • excel date range help please

    - by Mark
    I need help with either a formula or a macro to automate the dates on a grade sheet. We have class on Mondays and Wednesdays only. I would like to look up each quarter's date range from an input table (for example, Sept. 10 - Oct. 24) and have the code automatically insert the date of every Monday and Wednesday in a row at the top of my grade sheet. Every year I use the same Excel workbook I built to average and rate the grades with no problem, but I can't seem to get this one right. Currently I have to enter each date by hand. Any help would be greatly appreciated. Thanks again.
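
    A minimal sketch of the date logic in Python (the year in the example is arbitrary since the question doesn't give one; writing the dates into the header row would be a separate step, e.g. with VBA or openpyxl):

        from datetime import date, timedelta

        def class_dates(start, end, weekdays=(0, 2)):
            """Return every Monday (0) and Wednesday (2) between start and end, inclusive."""
            day = start
            dates = []
            while day <= end:
                if day.weekday() in weekdays:
                    dates.append(day)
                day += timedelta(days=1)
            return dates

        # Example quarter: Sept. 10 - Oct. 24 (year chosen only for illustration)
        for d in class_dates(date(2012, 9, 10), date(2012, 10, 24)):
            print(d.strftime("%m/%d/%Y"))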

    Read the article

  • VPS stops responding every now and again

    - by Or W
    I have a Linode VPS that I use to host some of my websites. It's Ubuntu based and up to date in terms of packages. I don't have any cron jobs scheduled or any automatic processes. I host a few (up to date) WordPress blogs there that have very little traffic altogether. Every day (at a different time) my server stops responding: I can't SSH to it, web access times out, and it just stays dead until I reboot it through the Linode manager. On the Linode dashboard I can see that the CPU is not very high (2-3%), incoming/outgoing traffic is at 0, and the IO count has a spike just before the server stops responding (swap IO is at 2k and the IO rate is at 5k). When I reboot the server everything is fine again. I'm trying to figure out a way to analyze what's going on at these random times when the server freezes up. How can I determine the problem?
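
    One way to capture evidence before a freeze is to leave a lightweight logger running that records memory and load figures every few seconds; a rough Python sketch (the log path and interval are arbitrary choices, nothing Linode-specific):

        import time

        LOG = "/var/log/freeze-probe.log"   # arbitrary location
        INTERVAL = 10                       # seconds between samples

        def snapshot():
            with open("/proc/loadavg") as f:
                load = f.read().strip()
            with open("/proc/meminfo") as f:
                mem = {line.split(":")[0]: line.split(":")[1].strip()
                       for line in f if ":" in line}
            return "%s load=[%s] MemFree=%s SwapFree=%s" % (
                time.strftime("%Y-%m-%d %H:%M:%S"), load,
                mem.get("MemFree"), mem.get("SwapFree"))

        while True:
            with open(LOG, "a") as out:
                out.write(snapshot() + "\n")
            time.sleep(INTERVAL)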

    Read the article

  • mysql server, open 'dead' connections

    - by Jeff
    My basic question is what kind of impact this has on the server. For example, there is an older program in my company that opens connections to a MySQL database server at a high rate (practically everything the application does opens a server connection). However, the application was not designed to dispose of the connections after they were created. A lot of the time the connections remain open but are never used again: open 'dead' connections, I guess you could say. They just remain connected until the server times them out, or until an admin goes in and removes the sleeping connections manually. I'm guessing this could be responsible for the occasional 'unable to connect' errors (connection limit reached) that we receive from other systems that try to access the MySQL database? Could this slow down the server as well? I'm curious what all this could cause. Thanks!
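
    A quick way to see how close those sleeping connections get to the limit is to compare SHOW PROCESSLIST against max_connections; a rough sketch using the mysql-connector-python package (host and credentials are placeholders):

        import mysql.connector

        conn = mysql.connector.connect(host="db.example.com",   # placeholder host
                                       user="admin", password="secret")
        cur = conn.cursor()

        cur.execute("SHOW PROCESSLIST")
        # Row columns: Id, User, Host, db, Command, Time, State, Info
        sleeping = [row for row in cur.fetchall() if row[4] == "Sleep"]

        cur.execute("SHOW VARIABLES LIKE 'max_connections'")
        max_connections = int(cur.fetchone()[1])

        print("%d sleeping connections out of a limit of %d"
              % (len(sleeping), max_connections))
        conn.close()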

    Read the article

  • Shaping with the Shorewall complex shaper does not work (or I don't understand its principle of operation)

    - by strangeman
    I have a router (Debian 6) with 2 network interfaces (and 1 virtual tun interface):
        eth0 - local network, 192.168.1.0/24, router IP is 192.168.1.1
        eth1 - internet
        tun0 - OpenVPN to the central office; the OpenVPN network is 10.1.0.0/24, the central office network is 192.168.0.0/24
    I need to shape all traffic that moves between 192.168.1.0/24 and 192.168.0.1:6666 (in both directions) and restrict its speed to 200kbit. This is my current configuration, but it does not work:
    tcdevices (set up interface parameters):
        #INTERFACE   IN-BANDWIDTH   OUT-BANDWIDTH
        eth0         100mbit        100mbit
        #LAST LINE -- ADD YOUR ENTRIES BEFORE THIS ONE -- DO NOT REMOVE
    tcrules (mark all traffic on port 6666):
        #MARK   SOURCE      DEST        PROTO   PORT(S)
        1       0.0.0.0/0   0.0.0.0/0   tcp     6666
        #LAST LINE -- ADD YOUR ENTRIES BEFORE THIS ONE -- DO NOT REMOVE
    tcclasses (shape all marked traffic):
        #INTERFACE   MARK   RATE        CEIL      PRIORITY   OPTIONS
        eth0         1      200kbit     200kbit   2
        eth0         255    9*full/10   full      1          default
        #LAST LINE -- ADD YOUR ENTRIES BEFORE THIS ONE -- DO NOT REMOVE
    Where is my mistake?

    Read the article

  • Network tries to re-identify itself now and then

    - by Don Young
    When the computer starts up it connects to the router (a ZTE 4G router) automatically, but after I have surfed the web for a while it tries to identify itself again, meaning I get the little blue circle next to the network icon in the corner. When browsing this does not cause any problems, but if I'm streaming a video, the video stops and I have to refresh the page to make it start again. And if I'm playing a game, LoL for example, the game freezes for about 2 seconds and then continues (due to the lost connection to the internet). I have no virus on my computer, although I had one before. I have reset my router, restarted my computer, updated my Ethernet driver, checked that IPv4 is set to automatic, and tried different router channels. I have had this router for a few months and the problem only started recently. Here is the ipconfig screen:

    Read the article

  • error when using OWA to access an Exchange 2010 mail server

    - by e0594cn
    Suddenly the error below appears when accessing the Exchange 2010 mail server through OWA, right after clicking the sign-in button on the initial page:
        The website cannot display the page (HTTP 500)
        Most likely causes:
        • The website is under maintenance.
        • The website has a programming error.
        What you can try: Refresh the page. Go back to the previous page.
        More information: This error (HTTP 500 Internal Server Error) means that the website you are visiting had a server problem which prevented the webpage from displaying. For more information about HTTP errors, see Help.
    Any suggestions? Thanks!

    Read the article

  • Yum through http proxy

    - by eodchop
    I have several Fedora 13 servers that have to connect through an HTTP proxy for yum updates. All port 80 traffic has to be routed through this proxy. I have set up the proxy server in the network settings GUI and I can browse the internet just fine. I have also set up my proxy information in /etc/yum.conf as follows:
        proxy=http:proxy.largecorp.corp/accelerated_pac_base.pac
        proxy_user=user
        proxy_password=password
    I then added export HTTP_PROXY="http:proxy.largecorp.corp/accelerated_pac_base.pac" to /etc/bashrc and sourced the file. When I run yum update I get:
        Loaded plugins: presto, refresh-packagekit
        Error: Cannot retrieve repository metadata (repomd.xml) for repository: fedora. Please verify its path and try again.
    All of the repo URLs are the defaults, as this is a fresh install.

    Read the article

  • When should I use SATA 6gb/s?

    - by Gili
    I purchased a Barracuda hard drive (model ST3000DM001) that supports a maximum read transfer rate of 210 MB/s and SATA 1.5/3/6 Gb/s. My motherboard has a limited number of 6 Gb/s ports, so I'd like to reserve them for when they're really necessary. When does a hard drive benefit from a SATA 6 Gb/s port? Doesn't it need a transfer speed of at least 375 MB/s to exceed the limit of SATA 3 Gb/s? Are there any other benefits of SATA 6 Gb/s over 3 Gb/s ports?
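
    For reference, a quick back-of-the-envelope calculation in Python (the 8b/10b factor is SATA's standard line encoding, not a figure from the drive's data sheet):

        # Raw line rate divided by 8 bits per byte, ignoring encoding overhead:
        print(3e9 / 8 / 1e6)               # 375.0 MB/s
        # With SATA's 8b/10b encoding only 8 of every 10 bits carry data:
        print(3e9 * 8 / 10 / 8 / 1e6)      # 300.0 MB/s usable on SATA 3 Gb/s
        print(6e9 * 8 / 10 / 8 / 1e6)      # 600.0 MB/s usable on SATA 6 Gb/s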

    Read the article

  • Set an Excel cell's color based on multiple other cells' colors

    - by Lord Torgamus
    I have an Excel 2007 spreadsheet for a list of products and a bunch of factors to rate each one on, and I'm using Conditional Formatting to set the color of the cells in the individual attribute columns. It looks something like this:
    I want to fill in the rating column for each item with a color, based on the color ratings of its individual attributes. Examples of ways to determine this:
    • the color of the category in which the item scored worst
    • the statistical mode of the category colors
    • the average of the category ratings, where each color is assigned a numerical value
    How can I implement any or all of the above rules? (I'm really just asking for a quick overview of the relevant Excel feature; I don't need step-by-step instructions for each rule.)
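
    Outside of Excel, the three candidate rules are easy to state on numeric scores; a small sketch assuming each color has already been mapped to a number (e.g. red=1, yellow=2, green=3):

        from statistics import mean, mode

        scores = [3, 1, 2, 3, 3]   # example category ratings for one product

        worst   = min(scores)            # rule 1: the worst category wins
        typical = mode(scores)           # rule 2: the most common color
        average = round(mean(scores))    # rule 3: mean rating, rounded back to a color

        print(worst, typical, average)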

    Read the article

  • Is 50% download speed on a wireless G network normal?

    - by Bartlomiej Skwira
    I have a wired connection of about 36 Mb/s, but my wireless speed maxes out at about 18-19 Mb/s. I have a WRT54G-TM (T-Mobile, 802.11g) router with DD-WRT firmware, upgraded to the latest build. I have made some settings changes:
    • channel: 13
    • wireless network mode: G-only
    • ACK timing: 0
    • fragmentation threshold and RTS threshold: 2304
    • basic rate: All
    The signal/noise ratio is -46/-94 and signal quality is about 50-60%. Is this normal with G networks? Edit: The AP is located about 2 meters from the laptop, with no walls or metal objects in between, but it is next to a TV. I've done a channel scan (I had trouble finding it; go to Status - Wireless - Site Survey) and everybody else is on channels 1 and 6. I switched to channel 11 but it didn't help. As for transmit power, I got the best results with the default 71 mW. The antenna might be a factor; I'm using the default 2 antennas.

    Read the article

  • Migrate servers without losing any data / time-limited MySQL dump?

    - by inac
    Is there a way to migrate from an old dedicated server to a new one without losing any data in between, and with no downtime? In the past, I've lost MySQL data between the time when the new server goes up (i.e., all files transferred, system up and ready) and when I take the old server down (data keeps being written to the old server until the new one takes over). There is also a short period where both are down while DNS and so on refreshes. Is there a way for MySQL/root to easily transfer all data that was updated or inserted within a certain time frame?

    Read the article

  • Network share not always available

    - by CapSoft
    Hello everybody, we have a Windows 2003 server with a shared directory. I've seen this thread, but it wasn't any help: http://superuser.com/questions/58890/the-specified-network-name-is-no-longer-available. I have a ping -t running from 3 PCs (one Vista and two Windows 7) and they all work. The problem appears when two users enter the network share: the 'network share is no longer available' message appears and the Explorer windows turn white. After F5 or a refresh, the shared directory is back. This is really strange. There is no antivirus or Kaspersky running on either end, and this is all on a LAN. The internet connection is really stable, so it's really strange. Could it be a router issue? I have checked the event log on the server for disk-failure related messages, but there are none.

    Read the article

  • pine sometimes does not update with newly received emails

    - by Tim
    Hi, I am using pine on a Kubuntu Linux email server to check my email. Most of the time when I am in the Inbox, the list of emails is updated automatically, so I don't have to do anything like pushing a refresh button as I would on the web. But sometimes it does not update, and I have to type "<" to go up a level and then select Inbox again to re-enter the inbox, at which point I find new emails that were not shown previously. I was wondering what the problem is. Thanks and regards!

    Read the article

  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website update cycle: it starts from the moment a user visits the site and ends 30 days later, which means an update could occur in the meantime and users would keep running with outdated content for a while. That could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I cache content on the client for periods of several days but somehow get users to refresh their CSS/JS files after the website has been updated? One solution I can think of is that, if website updates can be scheduled, the max-age returned by the server could be decreased every day accordingly, so that no matter when people visit the website, the end of the caching period coincides with the next update. But changing the server configuration every day goes against one of my sysadmin principles (once it's running, don't touch it).
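
    If the next update date is known, the decreasing max-age doesn't have to be maintained by hand; the application (or a small middleware layer) can compute it per response. A rough Python sketch, assuming a hypothetical NEXT_UPDATE date and a framework that lets you set response headers:

        from datetime import datetime, timezone

        NEXT_UPDATE = datetime(2024, 7, 1, tzinfo=timezone.utc)   # hypothetical deploy date

        def cache_control_header(now=None):
            """Cap max-age so every cached copy expires at the next scheduled update."""
            now = now or datetime.now(timezone.utc)
            remaining = int((NEXT_UPDATE - now).total_seconds())
            max_age = max(0, min(remaining, 30 * 24 * 3600))   # never longer than 30 days
            return "Cache-Control: max-age=%d" % max_age

        print(cache_control_header())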

    Read the article

  • wmp12 refuses to convert files when syncing

    - by Carbonara
    I have quite a large music collection of MP3s at 320 kbps and some WMA files at various bitrates. I'm trying to sync some of them to my HTC Desire and am quickly running out of space. WMP12 has options, per device you wish to sync, to automatically convert to a lower bitrate while syncing. I have set this to auto-convert files to a maximum bitrate of 192 kbps; that way I can fit more music on the device but keep the files on my PC at the higher rate. See these screens to confirm that it's set up correctly. The only problem is, surprise surprise for a Microsoft product, it doesn't actually work. Any file greater than 192 kbps, MP3 or WMA, simply fails: it doesn't get converted or copied to the device. The sync log displays the rather unhelpful message "error" and that's it. Any help would be appreciated. I'm not really looking for alternative software; I'd like to get this working since that's what it's supposed to do.

    Read the article

  • HAProxy overload protection

    - by user2050516
    Using HAProxy, would it be possible to configure overload protection to limit the number of requests sent to the backing HTTP server(s) to a given rate (e.g. 100 requests per second)? If the threshold is exceeded, requests should be answered with a default response. I am interested in requests per second, not connections per second, as a connection can carry many requests. And yes, improving the servers is not an option here. If this is possible, a configuration example would be excellent. Thank you in advance.
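
    This is not HAProxy configuration, but as a sketch of the behaviour being asked for, the usual model is a token bucket: requests spend tokens that refill at the target rate, and anything arriving with the bucket empty gets the default response. A minimal Python illustration (forward_to_backend and the 503 body are placeholders):

        import time

        class TokenBucket:
            """Allow roughly `rate` requests per second; excess requests are refused."""
            def __init__(self, rate):
                self.rate = float(rate)       # refill rate, also used as the bucket size
                self.tokens = float(rate)
                self.last = time.monotonic()

            def allow(self):
                now = time.monotonic()
                self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1.0:
                    self.tokens -= 1.0
                    return True
                return False

        def forward_to_backend(request):      # placeholder for the real proxying logic
            return (200, "response from backend")

        limiter = TokenBucket(rate=100)

        def handle(request):
            if limiter.allow():
                return forward_to_backend(request)
            return (503, "Service temporarily over capacity")   # the "default response"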

    Read the article

  • Applications getting killed automatically

    - by nebi
    I am running an httperf client on my machine and after a few seconds it gets killed. The command is:
        httperf --hog --client=0/1 --server=39.0.0.2 --port=80 --uri=/50kb --rate=20000 --send-buffer=4096 --recv-buffer=16384 --num-conns=6000000 --num-calls=1
    I have run this test many times before and never hit this error; I have only been seeing it for the last two days. My Ubuntu version is 10.04 and the httperf version is 0.9.0. dmesg shows:
        [ 2997.180620] Out of memory: kill process 7977 (apache2) score 70532 or a child
        [ 2997.180632] Killed process 7977 (apache2)
        [ 2997.184837] Out of memory: kill process 7971 (rsyslogd) score 8702 or a child
        [ 2997.184844] Killed process 7971 (rsyslogd)
        [ 2997.188823] Out of memory: kill process 7978 (apache2) score 1354 or a child
        [ 2997.188829] Killed process 7978 (apache2)
        [ 2997.192817] Out of memory: kill process 7973 (atd) score 561 or a child
        [ 2997.192822] Killed process 7973 (atd)
        [ 2997.196805] Out of memory: kill process 8102 (httperf) score 471 or a child
        [ 2997.196811] Killed process 8102 (httperf)
    Output of the free command:
                     total       used       free     shared    buffers     cached
        Mem:       3862768     163000    3699768          0       2384      13068
        -/+ buffers/cache:     147548    3715220
        Swap:      3905528          0    3905528

    Read the article

  • For a particular domain, how can I cache its JSON responses locally?

    - by Chris
    I'm coding the frontend of a web app that uses XHR to grab JSON data from a third party. The third-party service is slow and, because of its API design, we need to make a LOT of API requests every time I refresh the page to test some new code. It's making the development loop painful. The requests are GETs, POSTs and PUTs, even though I'm pretty sure none of them change state. I want to fetch the JSON from localhost rather than from this third-party API, simply to make my development process faster.
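
    One way to do this during development is to run a small caching proxy on localhost and point the frontend's API base URL at it: each request is forwarded to the third party once and the stored JSON is replayed afterwards. A rough Python sketch (UPSTREAM is a placeholder, and it deliberately caches POST/PUT as if they were idempotent, which only holds because these calls don't change state):

        import hashlib
        import os
        import urllib.request
        from http.server import BaseHTTPRequestHandler, HTTPServer

        UPSTREAM = "https://api.example.com"   # placeholder for the third-party base URL
        CACHE_DIR = "json_cache"
        os.makedirs(CACHE_DIR, exist_ok=True)

        class CachingProxy(BaseHTTPRequestHandler):
            def _handle(self, method):
                length = int(self.headers.get("Content-Length", 0))
                body = self.rfile.read(length) if length else b""
                # The cache key covers method, path and request body so distinct calls get distinct entries.
                key = hashlib.sha256(method.encode() + self.path.encode() + body).hexdigest()
                cache_file = os.path.join(CACHE_DIR, key + ".json")
                if os.path.exists(cache_file):
                    with open(cache_file, "rb") as f:
                        payload = f.read()
                else:
                    req = urllib.request.Request(UPSTREAM + self.path, data=body or None, method=method)
                    req.add_header("Content-Type", self.headers.get("Content-Type", "application/json"))
                    with urllib.request.urlopen(req) as resp:
                        payload = resp.read()
                    with open(cache_file, "wb") as f:
                        f.write(payload)
                # Always replies 200 with the cached body, for simplicity.
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(payload)))
                self.end_headers()
                self.wfile.write(payload)

            def do_GET(self):
                self._handle("GET")

            def do_POST(self):
                self._handle("POST")

            def do_PUT(self):
                self._handle("PUT")

        if __name__ == "__main__":
            HTTPServer(("localhost", 8080), CachingProxy).serve_forever()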

    Read the article

  • MySql transfer / update (a bit specific)

    - by Jeff
    Before posting I dug through the whole site but didn't find help for my problem, so I hope someone can help... Facts:
    • 30 GB MySQL database on a remote server (about 20,000,000 rows)
    • the data is updated once a week on the local network (MySQL)
    • I need to transfer/replace the remote database with the locally updated one
    • the connection is about 2 MB (real MB, not Mbps) up/down
    The point is that I can't have any downtime on the remote MySQL server. Until now I have tried:
    • Navicat data sync: OK, but takes about 3 days to finish
    • dbForge: OK, but needs 5 days to finish
    • mysqldump transferred to the remote server and executed there: about a day, but a lot of downtime
    • rsync of the database folder /mysql/lib/MY_DATABASE: 4 hours, but afterwards I always need to run a repair on the remote server, which takes about 2 hours, and there is a lot of downtime
    • mysqldump piped from the command line directly to the remote server: still not satisfied, many problems
    • MySQL replication: slow
    I could list more things that I tried... Anyway, what is the best way to refresh the remote MySQL database weekly while having zero downtime and no huge server load? If you have any ideas, please share.

    Read the article

  • Multiple public/private key pairs for the same user

    - by bruceb
    First, sorry if this question has already been asked and answered; I've searched, but perhaps I haven't recognised the answer. What we have is a cluster of servers which need to access a single remote server using sftp. We are migrating from one remote server to another at the same (remote) location. We also want to refresh the public/private key pairs in the configuration as part of an ongoing security review. My question is: can we have multiple public/private key pairs for the same user between server A and server B? I want to do this to allow for cutover testing, but I am concerned that the software checking keys may only try one key of each type (RSA/DSA?) before rejecting that connection method and moving on to the next type of key. Hope it's a straightforward question; please let me know if I need to supply more details. Thanks in advance, Bruce

    Read the article

  • Hi, I want to write a Python script to do some things in Excel [closed]

    - by MEOWER
    I want to write a Python script that will open a particular Excel file, "refresh" the Bloomberg tab (with the Bloomberg add-in), and export all the individual sheets as CSV. How can I do this? What are the basic things I should know, and is there any reference script I can use? I'm using Excel 2010 with the Bloomberg plugin. Not sure if this is the correct forum to use, but please move this to another forum if it's more appropriate there. Thanks.
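
    A rough sketch of the Excel side using the pywin32 COM bindings (the paths are placeholders, and the Bloomberg add-in may need its own refresh call; RefreshAll only covers Excel's own data connections):

        import os
        import win32com.client

        WORKBOOK_PATH = r"C:\data\bloomberg_workbook.xlsx"   # placeholder path
        OUT_DIR = r"C:\data\csv_out"                         # placeholder output folder

        excel = win32com.client.Dispatch("Excel.Application")
        excel.Visible = False
        wb = excel.Workbooks.Open(WORKBOOK_PATH)

        wb.RefreshAll()                          # refresh Excel data connections
        excel.CalculateUntilAsyncQueriesDone()   # wait for pending queries (may not cover Bloomberg)

        os.makedirs(OUT_DIR, exist_ok=True)
        for sheet in wb.Worksheets:
            sheet.Copy()                         # copy the sheet into a new single-sheet workbook
            temp = excel.ActiveWorkbook
            temp.SaveAs(os.path.join(OUT_DIR, sheet.Name + ".csv"), FileFormat=6)  # 6 = xlCSV
            temp.Close(SaveChanges=False)

        wb.Close(SaveChanges=False)
        excel.Quit()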

    Read the article

  • Need to scale quickly. Which cloud service should I use?

    - by mk1000
    Traffic to my Facebook app is growing at an insane rate and I need some suggestions on how to scale. I'm probably not even going to be able to keep it running through the end of the day, as it's hosted on my already overloaded dedicated server. I need to either move it to its own box or to a cloud service like EC2. Something like EC2 seems like the way to go, but my server admin skills are terrible. Is there a good front-end management UI for EC2, or another hosting service comparable in cost that is fully managed? I don't mind going with something a bit more expensive now if it means I can get everything switched over and running within 24 hours.

    Read the article

  • Best monitor for reading

    - by wajed
    Will response rate make a difference? What is a good brightness? What is a good contrast ratio? There are definitely other things to look for, so please give me your opinion. Also, what screen size is good for reading? What size would you choose between 17" and 22"? I'm thinking of getting one 17-19" monitor for reading and one 22" for movies. Or would two 22" monitors, one vertical and one horizontal, be better? I think I should look for a lower native resolution, right?

    Read the article

  • Cannot access internal network on OSX 10.6.6

    - by cabuki
    Last week, I began having trouble connecting to our internal web servers. Usually a refresh would take care of it, or switching to a different wireless network, but as of yesterday that wasn't enough. We have an internal DNS server using dnsmasq and a private internal host name (us.lcl). Once I started having more issues with names not resolving, I tried pinging the server. Using the internal host name (s1.us.lcl), it failed. I tried using the IP address, but that also failed. I have no problems accessing external sites, except that it's a bit slower than normal. A reboot yesterday at lunchtime after following the instructions here seemed to fix the issue, but when I came into the office this morning, it had stopped working again. As of this posting, I cannot ping, ssh or access the web server using the internal host name or IP address. I'm the only one running 10.6 in my office and none of my colleagues has this issue.

    Read the article
