Search Results

Search found 4721 results on 189 pages for 'traffic'.

  • Why do servers go down after a lot of traffic?

    - by mohabitar
    I'm working on an iOS app that makes extensive use of databases, and users will be able to sync their data to a server. However, I'm terrified that if too many users start using the app, the servers will no longer be able to handle the load. I'm not a server guy at all and am not too familiar with how this works, but my question is: why do servers get overloaded, and how can that be prevented? Does it depend on who my server host is, or on the efficiency of my code? If I use a reliable host such as Amazon AWS, am I still at risk of server problems? Bottom line: does it come down to the way I implement my code, or to who my host is?

    Read the article

  • Can I replace a router and DHCP server without disrupting traffic?

    - by SRobertJames
    I have a device which acts as a router and DHCP server. I'm replacing it and would like to minimize downtime. If I unplug it and plug in a different device with the same IP, will all the PCs with DHCP leases keep working? (I have DHCP conflict detection on, so it shouldn't reassign a DHCP address that is already in use.) What if I want to change the IP (new subnet)? Is there any way to tell all the clients (Windows PCs) to release their DHCP leases and request new ones in a minute? If, before unplugging the old device, I have it release all DHCP leases, will the Windows PCs automatically ask for new DHCP addresses?
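
    For the release/renew part, here is a minimal sketch of one way to script it, assuming you can run a script on each Windows client (for example via a scheduled task); ipconfig /release and ipconfig /renew are the standard Windows commands for dropping and re-requesting a lease:

      # dhcp_renew.py - drop the current lease, wait for the new DHCP server, request a fresh one
      import subprocess
      import time

      def renew_lease(wait_seconds: int = 60) -> None:
          subprocess.run(["ipconfig", "/release"], check=True)  # give up the old lease
          time.sleep(wait_seconds)                              # give the replacement server time to come up
          subprocess.run(["ipconfig", "/renew"], check=True)    # request a new lease

      if __name__ == "__main__":
          renew_lease()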

    Read the article

  • Using a NAT rule to translate 80/443 traffic to a web server, but internal users cannot access it using the external IP/domain name

    - by Josh
    I am using Cisco ASDM for an ASA. I have my internal network, called soa. My outside interface is called outside. Let's say the outside IP given to me by my ISP is y.y.y.y. I have a web server inside my network with a static IP of x.x.x.110. I have configured 2 static NAT rules (one for http, the other for https). Source is x.x.x.110, interface is outside, service (http or https). Maybe I am doing this wrong, but when I run the packet tracer, I choose the outside interface, use 8.8.8.8 as the source IP, and set the destination IP to my outside IP address, y.y.y.y. When I run that, it shows the packet traversing successfully, in 9 steps. For my other test, I switch to the soa interface, enter an IP on that network, and leave the destination the same. This test comes up with 2 steps and then fails on my access list. The rule that fails is my catch-all: source: any, destination: any, service: ip, action: deny. What rule do I need to make to allow my soa network to go out and come back in via my external IP address (using a domain name attached to that IP in my DNS, of course)?

    Read the article

  • Our company claims that the DLP system can even monitor the contents of HTTPS traffic. How is this possible?

    - by Ryan
    There is software installed on all client machines for DLP (Data Loss Prevention) and HIPAA compliance. Supposedly it can read HTTPS data in the clear. I always thought that traffic between the browser and the server was encrypted end to end. How can software sneak in and grab this data from the browser before it is encrypted, or after it is decrypted? I am just curious how this could be possible. I would think that a browser wouldn't be considered very secure if this were possible.

    Read the article

  • How can I monitor network traffic in an all Mac home network?

    - by raiglstorfer
    I have an all-Mac network consisting of an AirPort Extreme, 1 Mac Pro, 1 Mac mini, 2 MacBook Pros, 2 iPads, and 2 iPhones. The Mac Pro is connected directly to the AirPort Extreme via Cat5 and the rest runs over wireless. Lately I've been prompted by Google to enter CAPTCHAs frequently. The message states that I might have software running on my network that I'm not aware of. My wireless router is password protected using WPA2 Personal and I frequently change my password, so I don't think someone is using the network from outside (but I have no way to confirm this). I'm looking for a relatively cheap (preferably open source) solution that would enable me to monitor and profile network usage by machine and port. Can someone recommend a solution?
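
    As a quick sanity check before installing a full monitoring tool, a short packet-capture script can show per-host, per-port byte counts. This is only a sketch and assumes Python with the third-party scapy library on the Mac Pro; note that on a switched/wireless network a single machine only sees its own traffic (plus broadcasts) unless the capture runs on the gateway or a mirrored port:

      # traffic_profile.py - rough per-source, per-port byte counts using scapy
      # (requires: pip install scapy, and root privileges to sniff)
      from collections import Counter
      from scapy.all import sniff, IP, TCP, UDP

      counts = Counter()

      def tally(pkt):
          if IP not in pkt:
              return
          port = None                      # stays None for non-TCP/UDP packets (e.g. ICMP)
          if TCP in pkt:
              port = pkt[TCP].dport
          elif UDP in pkt:
              port = pkt[UDP].dport
          counts[(pkt[IP].src, port)] += len(pkt)

      # capture 500 packets, then print the top talkers by (source IP, destination port)
      sniff(prn=tally, store=False, count=500)
      for (src, port), nbytes in counts.most_common(10):
          print(f"{src:<16} port {port}: {nbytes} bytes")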

    Read the article

  • Why does my router log crazy amounts of blocked traffic on port 1701?

    - by Vlad Seghete
    I have a 2701HGV-B 2Wire modem and router (AT&T). The log is basically full of entries similar to the following, with between a fifth and a third of a second between entries: src=86.156.7.170 dst=xxx.xxx.xxx.38 ipprot=17 sport=6882 dport=1701 Unknown inbound session stopped src=58.176.22.252 dst=xxx.xxx.xxx.38 ipprot=17 sport=21573 dport=1701 Unknown inbound session stopped src=91.221.6.250 dst=xxx.xxx.xxx.38 ipprot=17 sport=25902 dport=1701 Unknown inbound session stopped ... where the source IP is different for every entry. The entries accumulate constantly; every second the router is on, several of them appear in the log. The destination is the WAN address of my router. I understand that this is somehow related to VPNs (port 1701 is L2TP), but I don't know enough to figure out why my router is getting bombarded with session requests. Is there anything fishy going on, or is this normal? If it is normal, how do I keep these entries from spamming my log files? Since there are about two or three of them every second, everything else gets drowned out.

    Read the article

  • How to back up servers to an SSH host with low traffic, versioning, and encryption?

    - by leto
    Hello, I've not run backups of my personal stuff for more years than I can remember, until waking up lately and realising, contrary to my prior belief: actually, I care! :) Now I have a central data server at home where I want to attach an external drive, onto which I want to save backups of my most important stuff, like years of self-written scripts, database dumps, you name it. I've tinkered with rsync+ssh over the last two years and also tried tar over ssh, but I don't yet know the simplest and easiest-to-maintain way to do it. Here's my workload: a typical LAMP server (<5GB of data, so lots of small files) which I'd like to back up fully, connected via 10Mbit; my personal stuff (<750GB of data) from a Mac connected via GE; my passwords in an encrypted container (100MB) from OpenBSD connected via serial PPP; my e-mail from the last ten years (<25GB) as a Maildir, which I need to keep in readable format; and some archives (tar.*) which I need to back up only once and keep in readable format. (Deleted my ideas, as I'm here for suggestions.) What I need: 1. use an SSH tunnel for data transfer; 2. be quick with lots of small files; 3. keep revisions; 4. be sure the data I save is not corrupted; 5. intelligent resume functions and the ability to deal with network congestion :); 6. compressed and optionally encrypted storage; 7. be able to extract data from a backup easily (filesystem-like usage would be nice). How and with what software would you back up this stuff? Hints to tools that solve only part of my problem (like encryption) are also greatly appreciated. Greets
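
    As one illustration of the "keep revisions over SSH" requirement (a sketch, not a recommendation of a particular tool): rsync over SSH with --link-dest produces dated snapshots in which unchanged files are hard links to the previous snapshot, so each revision is browsable like a filesystem while costing little extra space. The remote host and path below are hypothetical placeholders:

      # snapshot.py - dated, hard-linked rsync snapshots over SSH (illustrative sketch)
      import datetime
      import subprocess

      SRC = "/var/www/"                         # directory to back up
      REMOTE = "user@backuphost:/backups/lamp"  # hypothetical SSH target
      today = datetime.date.today().isoformat()

      subprocess.run([
          "rsync", "-az", "--delete",
          "--link-dest=../latest",              # unchanged files become hard links to the previous snapshot
          "-e", "ssh",
          SRC, f"{REMOTE}/{today}/",
      ], check=True)

      # point 'latest' at the snapshot we just made
      subprocess.run(["ssh", "user@backuphost",
                      f"ln -sfn {today} /backups/lamp/latest"], check=True)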

    Read the article

  • How is network traffic routed (VPN vs. 'open')?

    - by craibuc
    If my workstation has a VPN connection to a given network (a client's, for instance) and an ordinary non-VPN connection, what determines how a request for a network resource (e.g. a web page) is routed? Moreover, given that a resource (e.g. google.com) could be available via either route (i.e. VPN or non-VPN), how is this route determined? Is there a way to 'force' the routing to use a given route, or the route with lower overhead?
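
    In short, the operating system's routing table decides, by longest prefix match and then by metric, and a full-tunnel VPN usually wins for everything because it installs a default route. One way to see which route the OS would actually pick for a given destination is to let the kernel choose and read back the local address it selected; a minimal sketch, assuming Python is available (connecting a UDP socket sends no packets):

      # which_route.py - ask the OS which local address (and hence interface/route)
      # it would use to reach a destination; nothing is actually transmitted.
      import socket

      def local_address_for(dest_ip: str, port: int = 53) -> str:
          s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          try:
              s.connect((dest_ip, port))   # UDP connect only selects a route
              return s.getsockname()[0]
          finally:
              s.close()

      print(local_address_for("8.8.8.8"))          # internet-bound: expect your LAN/Wi-Fi address
      print(local_address_for("192.168.123.10"))   # hypothetical client-network host: expect the VPN address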

    Read the article

  • How can I make the Windows VPN route selective traffic (by destination network)?

    - by Legooolas
    I want to use a Windows VPN, but only for a particular network, so that it doesn't take over my entire network connection. E.g., instead of the VPN becoming the default route, make it only the route for 192.168.123.0/24. (I can see that there is a solution for this for Ubuntu in this question, but sometimes I have to do this on Windows too.) Can this be automated so that whenever I connect to the VPN it does this?
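
    A sketch of one way to automate this, assuming the usual first step has been done by hand (unchecking "Use default gateway on remote network" in the VPN connection's IPv4 settings so it stops becoming the default route). The connection name and the VPN-assigned address below are hypothetical placeholders, and the script must run from an elevated prompt:

      # vpn_route.py - dial a Windows VPN and route only 192.168.123.0/24 through it (sketch)
      # Assumes "Use default gateway on remote network" is already unchecked for this connection.
      import subprocess

      VPN_NAME = "ClientVPN"        # hypothetical: name of the configured Windows VPN connection
      VPN_LOCAL_IP = "10.0.0.5"     # hypothetical: the address the VPN server assigns to this client

      # dial the VPN (rasdial ships with Windows)
      subprocess.run(["rasdial", VPN_NAME], check=True)

      # send only 192.168.123.0/24 via the VPN link, using the VPN-assigned address as the gateway
      subprocess.run(["route", "add", "192.168.123.0", "mask", "255.255.255.0",
                      VPN_LOCAL_IP, "metric", "1"], check=True)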

    Read the article

  • How to track subdomains with Google Analytics while having mod_rewrite redirect to a subdomain?

    - by Marek
    When users come directly to domain.com or www.domain.com, I am redirecting them to shop.domain.com via this .htaccess rewrite: RewriteEngine on RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [OR] RewriteCond %{HTTP_HOST} ^domain\.com$ RewriteRule ^(.*)$ http://shop.domain.com/ [R=301,L] The content served by shop.domain.com has the following tracking code parameters: var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-123456-6']); _gaq.push(['_setDomainName', '.domain.com']); _gaq.push(['_trackPageview']); All direct visits that reach shop.domain.com as a result of the rewrite from domain.com are tracked as referral traffic, showing my own domain.com as the referral source in Google Analytics. I would like to track these visits as direct traffic. How do I change the configuration so that traffic redirected by mod_rewrite from my own domain to my subdomain is tracked as direct traffic?

    Read the article

  • Routing only some local IPs through VPN on dd-wrt

    - by bo-inge-ostberg
    Much like this entry: http://serverfault.com/questions/94283/using-dd-wrt-to-connect-to-vpn-and-forward-all-traffic-of-certain-devices-through , I have set up my router with dd-wrt + OpenVPN to connect to a VPN. This works fine, and all traffic from behind the router goes through the VPN. How do I route traffic in the router so that only certain IPs from the LAN go through the VPN, while the others take the normal route? Is it also possible to allow traffic from certain local IPs to go ONLY through the VPN, making it impossible for them to use the regular internet connection if the VPN is down? I know this question was answered in the post I linked to, but that just doesn't seem to work for me. The routing table and rules change, but traffic still just goes through the VPN.

    Read the article

  • Ubuntu and VirtualBox

    - by Sinan
    I have the following configuration: a host running Windows 7 and a guest running Ubuntu 14.04 LTS (VirtualBox). I am connecting a Cisco router directly to my PC running Windows 7 and testing the router's NetFlow export against the VirtualBox guest. I am having difficulty capturing the NetFlow traffic from the Cisco device in my VirtualBox guest on port 2222. I tried the different networking modes provided by VirtualBox (i.e. NAT, Bridged Adapter, Host-only Adapter) but have not succeeded in capturing the NetFlow traffic. Could you please advise me on the configuration that needs to be done in VirtualBox to allow capturing the traffic coming from the router? I can successfully capture the NetFlow traffic on my PC (Windows 7). Thank you
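
    With a bridged adapter the guest sits on the same LAN as the router and should receive the export directly (provided the router's flow-export destination is the guest's own IP); with NAT, a UDP port-forward for 2222 is required. A minimal sketch of a listener to run inside the Ubuntu guest to confirm whether NetFlow datagrams arrive at all; the VM name in the commented VBoxManage rule is a hypothetical placeholder:

      # netflow_probe.py - run inside the Ubuntu guest to check whether NetFlow
      # datagrams are arriving on UDP 2222 at all.
      # If the VM uses NAT networking, a host-to-guest forward is also needed, e.g.:
      #   VBoxManage modifyvm "UbuntuVM" --natpf1 "netflow,udp,,2222,,2222"
      # ("UbuntuVM" is a hypothetical VM name.)
      import socket
      import struct

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.bind(("0.0.0.0", 2222))
      print("listening for NetFlow on UDP 2222 ...")

      while True:
          data, (src, sport) = sock.recvfrom(65535)
          if len(data) >= 4:
              version, count = struct.unpack("!HH", data[:4])  # common to NetFlow v5/v9 headers
              print(f"{len(data)} bytes from {src}:{sport}  version={version} records={count}")
          else:
              print(f"{len(data)} bytes from {src}:{sport} (too short to parse)")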

    Read the article

  • Using URL Rewrite to Block Page Requests

    - by The Official Microsoft IIS Site
    The other day I was checking the traffic stats for my WordPress blog to see which of my posts were the most popular. I was a little concerned to see that wp-login.php was in the Top 5 total requests almost every month. Since I'm the only author on my blog, my logins could not possibly account for the traffic hitting that page. The only explanation could be that the additional traffic was coming from automated hacking attempts. Any server administrator concerned about security knows that “footprinting...(read more)

    Read the article

  • Google AdSense threshold estimation

    - by Wladimir Ivanov
    I have an electronic music blog with traffic mainly from North America, Western Europe and Russia. Daily I get about 100 unique visitors with 150-200 pageviews. Should I start AdSense, or do I need to work on increasing the traffic first? Can you suggest another appropriate monetizing option for this case? How much time would it take me to hit the $100 AdSense payout threshold with the given traffic statistics? Thanks in advance.
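
    A rough back-of-the-envelope, using placeholder earnings figures rather than real AdSense data (display RPM varies widely by niche and geography); the only fixed number is the $100 payout threshold:

      # adsense_estimate.py - rough time-to-payout estimate with placeholder numbers
      pageviews_per_day = 175          # midpoint of the 150-200 range above
      rpm_low, rpm_high = 0.5, 3.0     # hypothetical revenue per 1000 pageviews, in USD
      threshold = 100.0                # AdSense pays out once earnings reach $100

      for rpm in (rpm_low, rpm_high):
          per_day = pageviews_per_day / 1000 * rpm
          print(f"RPM ${rpm:.2f}: ~${per_day:.2f}/day -> ~{threshold / per_day / 30:.0f} months to ${threshold:.0f}")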

    Read the article

  • Keyword Optimization - 3 Tips to Drastically Improve Search Engine Performance For Free

    Keywords are the backbone of traffic for anyone. It doesn't matter where you place links, how you write articles, or how you build your pages: keywords play a huge part in getting traffic to your website (and blog). More and more I receive emails about some new internet marketing "free traffic source" or the "best new back-linking method", and while these are good and may indeed work in some situations, they are somewhat redundant in others.

    Read the article

  • How to check that I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May. The website traffic dropped almost 30%. I have been working hard on the website for the last 2.5 months, and my website's traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find my website's search engine visibility has improved: it is now appearing for more search queries, and my website's impressions have also increased. What I am worried about is that my website is nowhere in the top 5 pages for keywords with high competition and the highest search volume. They are few in number but important. Should I consider my current situation a full recovery or only a partial one? And if it is only partial, how come traffic is higher than it was before Penguin 2.0?

    Read the article

  • Why You Should Use SEO

    Search engine optimization (SEO) is a term for optimizing your website so that search engines will be able to find you faster and also give you a higher ranking. A higher ranking means they will send you more free traffic. And believe it or not, the traffic that the search engines send you is by far the best traffic you could hope for.

    Read the article

  • How to get experience in large scale databases?

    - by Justin
    I have written applications that are very small scale, and the code I write works fine for them. But I have often wondered how the server-side code I write would scale up from hundreds of queries per day to millions. Also, when looking at possible jobs/projects, people are often looking for developers with experience in this sort of high-traffic database design, so I would at least like to be able to say: I haven't gotten to work on a project that was this popular, but I have at least tried to simulate it. Are there tools or frameworks that can generate a lot of traffic, or at least simulate what would happen with traffic at different orders of magnitude, so I could get some practice writing optimized code for higher-traffic applications?
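
    Dedicated load-testing tools exist for exactly this, but even a short script gives a feel for how latency behaves as concurrency grows. A minimal sketch using only the Python standard library; the URL is a placeholder for a local test instance of your own application, not something to point at servers you don't own:

      # load_sketch.py - fire N concurrent requests at a test endpoint and report latencies
      import time
      import urllib.request
      from concurrent.futures import ThreadPoolExecutor
      from statistics import mean

      URL = "http://localhost:8000/"   # hypothetical local test instance
      REQUESTS = 200
      CONCURRENCY = 20

      def one_request(_):
          start = time.perf_counter()
          with urllib.request.urlopen(URL, timeout=10) as resp:
              resp.read()
          return time.perf_counter() - start

      with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
          latencies = list(pool.map(one_request, range(REQUESTS)))

      print(f"{REQUESTS} requests, {CONCURRENCY} concurrent")
      print(f"mean {mean(latencies)*1000:.1f} ms, worst {max(latencies)*1000:.1f} ms")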

    Read the article

  • Amazon EC2 vs Dedicated server at Hetzner, what's the use for EC2?

    - by C-Blu
    After searching the web I still can't find the reason to use EC2. What's the point of scaling with EC2? "If you expect a huge burst in traffic," they say. OK, but what if you already have a couple of sites with good traffic, and, for example, a medium reserved EC2 instance is not enough? You are paying $36.60 (medium reserved for 1 year) in EU (Ireland), plus traffic, plus optional expenses for databases and S3 if you use them. Of course, at some point, while you are under $56.6-$66.1, you can optimize your hosting costs with Amazon EC2. But if you purchase an EX4 server from Hetzner, it will surpass your performance needs for a long time before you get massive traffic. (Am I wrong?) CPU: i7-2600 quad-core (3.4-3.8 GHz); RAM: 16 GB; HDD: 2x3 TB SATA (6 Gbit/s) - I think the disk performance of a dedicated server is better than that of Amazon EBS; traffic: 10 TiB per month included. This is what you get from Hetzner for $56 (-19% VAT) or $66 for EU residents. Please tell me, what's the reason to use Amazon? Which load won't a server from Hetzner handle that Amazon Auto Scaling will? Is the maintenance of a dedicated server vs EC2 still the same? Or is it that a hardware failure at Amazon won't ruin your EBS storage? I'm still not at the level where I need expensive hosting, but I want to know beforehand, just to be sure whether the Amazon infrastructure is better than the pure performance of Hetzner's hardware.

    Read the article

  • Things to Take Note of When Writing Directory Submissions

    Directory submissions, though past their glory days, are still a highly regarded way of getting traffic to websites. There are a lot of people who still frequent directories and use search engines. A high placement in search engine directories not only increases the quantity of traffic but also adds to its quality. In some ways, directory submissions depend a lot on the accuracy of your descriptions.

    Read the article

  • A Fast and FUN Way to Google First Page Formula

    Internet traffic is a numbers game: more traffic, more money, and you can get traffic from Google for absolutely free. At zero cost to you, but with some knowledgeable effort that will put you ahead of the competition. The actionable strategies discussed in this article will certainly help you rank on the first page of Google.

    Read the article

  • Boosting That Return Rate

    I've noticed something about most beginner Internet marketers. They get really excited when they get traffic - even if that traffic doesn't convert to anything in the way of sales or proceeds. They're over the moon about the chance to make a profit. That's where they go wrong, unfortunately. It's fun to be excited at first, but after a week or so, reality really needs to set in and you need to start making decisions based on factors like conversions and ROI rather than raw traffic.

    Read the article

  • Solve the Tor exit-node problem by using a .onion proxy?

    - by rd.
    I would like to improve the Tor network, where the exit nodes are a weak point for concealing traffic. From my understanding, traffic to .onion sites is not decrypted by exit nodes, so therefore - in theory - a .onion-site web proxy could be used to further anonymize traffic. Yes/no? Perhaps you have insight into the coding and routing behind these concepts and can elaborate on why this is or is not a good idea.

    Read the article
