Search Results

Search found 11880 results on 476 pages for 'high traffic'.

Page 27/476

  • Broadcom BCM4313 wireless slow and high-latency

    - by Florin Andrei
    Ubuntu 12.10 64-bit on a Dell Latitude E6330 laptop. Wireless is pretty slow. It connects quickly enough, but then it acts like a dial-up connection. My SSH sessions over WiFi are slow and laggy. Even browsing is slow; the pages load like it's 1998. This does not depend on the access point; it's the same both at home and at work. Other systems work fine on these access points. I had an older Dell laptop before, with different WiFi hardware, and it was much faster over the same wireless access points. Is this a known issue with this hardware? If so, are there any solutions?

    Read the article

  • High Usage of RAM by wxPython's GUI and need some advice to reduce it

    - by user67024
    I've recently developed a GUI in wxPython for the Windows platform. It contains five tabs; four of them are just RichTextCtrl boxes and the other one has controls for uploading files, buttons, TextCtrls, a slider, etc. As I was new to GUI development in Python, I used wxFormBuilder to generate some of the code, using a good number of sizers. So, now the problem is that the GUI starts off with an initial memory footprint of around 40 MB, which is too much for such a simple application (or so I think). Also, the functions that handle the data use huge lists, since the program is for debugging large data logs and identifying the problems in them, which means I can't afford to spend memory on the GUI. So, how can I reduce that starting working-set size? Is this a general issue in wxPython? I'm currently trying to use profilers, but I'm not sure that will help.
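
    A minimal sketch (editorial, not from the original question) of one common way to cut the startup footprint: create the heavy RichTextCtrl pages lazily, only when a tab is first opened. It assumes wxPython is installed; the tab names and layout are illustrative.

        import wx
        import wx.richtext as rt

        class LazyNotebook(wx.Frame):
            """Create heavy notebook pages only when the user first opens them."""
            def __init__(self):
                super().__init__(None, title="Lazy tabs demo")
                self.nb = wx.Notebook(self)
                self.built = set()
                # Placeholder panels are cheap; real contents are built on demand.
                for title in ("Log A", "Log B", "Log C", "Log D", "Upload"):
                    self.nb.AddPage(wx.Panel(self.nb), title)
                self.nb.Bind(wx.EVT_NOTEBOOK_PAGE_CHANGED, self.on_page_changed)
                self.build_page(0)  # build only the initially visible tab up front

            def on_page_changed(self, event):
                self.build_page(event.GetSelection())
                event.Skip()

            def build_page(self, index):
                if index in self.built:
                    return
                self.built.add(index)
                page = self.nb.GetPage(index)
                sizer = wx.BoxSizer(wx.VERTICAL)
                # The heavy control (a RichTextCtrl here) is created only now.
                sizer.Add(rt.RichTextCtrl(page), 1, wx.EXPAND)
                page.SetSizer(sizer)
                page.Layout()

        if __name__ == "__main__":
            app = wx.App(False)
            LazyNotebook().Show()
            app.MainLoop()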

    Read the article

  • High end mobile workstations with pointer stick

    - by Elijah Lynn
    I am looking for a list of higher-end mobile workstations that run Ubuntu/Kubuntu well and also have a hardware pointer stick. There's an illustration of one on sciencesurvivalblog. I wouldn't mind getting a MacBook Pro and wiping it, but they refuse to use pointer sticks, and to me they are extremely efficient. I see a lot of potential in Lenovo ThinkPads as well. System76 said they have no plans to implement a hardware pointer stick, so that leaves them out as well. Any ideas?

    Read the article

  • Simple OpenGL program major slow down at high resolution

    - by Grieverheart
    I have created a small OpenGL 3.3 (Core) program using freeglut. The whole geometry is two boxes and one plane with some textures. I can move around as in an FPS and that's it. The problem is that I see a big drop in fps when I make my window large (i.e. above 1920x1080). I have monitored GPU usage in full-screen and it shows a GPU load of nearly 100% and a Memory Controller load of ~85%. At 600x600 these numbers are at about 45%, and my CPU is also at full load. I use deferred rendering at the moment, but even with forward rendering the slowdown was nearly as severe. I can't imagine my GPU is not powerful enough for something this simple when I play many games at 1080p (I have a GeForce GT 120M, btw). Below are my shaders.

    First Pass

    #VS

        #version 330 core
        uniform mat4 ModelViewMatrix;
        uniform mat3 NormalMatrix;
        uniform mat4 MVPMatrix;
        uniform float scale;
        layout(location = 0) in vec3 in_Position;
        layout(location = 1) in vec3 in_Normal;
        layout(location = 2) in vec2 in_TexCoord;
        smooth out vec3 pass_Normal;
        smooth out vec3 pass_Position;
        smooth out vec2 TexCoord;

        void main(void){
            pass_Position = (ModelViewMatrix * vec4(scale * in_Position, 1.0)).xyz;
            pass_Normal = NormalMatrix * in_Normal;
            TexCoord = in_TexCoord;
            gl_Position = MVPMatrix * vec4(scale * in_Position, 1.0);
        }

    #FS

        #version 330 core
        uniform sampler2D inSampler;
        smooth in vec3 pass_Normal;
        smooth in vec3 pass_Position;
        smooth in vec2 TexCoord;
        layout(location = 0) out vec3 outPosition;
        layout(location = 1) out vec3 outDiffuse;
        layout(location = 2) out vec3 outNormal;

        void main(void){
            outPosition = pass_Position;
            outDiffuse = texture(inSampler, TexCoord).xyz;
            outNormal = pass_Normal;
        }

    Second Pass

    #VS

        #version 330 core
        uniform float scale;
        layout(location = 0) in vec3 in_Position;

        void main(void){
            gl_Position = mat4(1.0) * vec4(scale * in_Position, 1.0);
        }

    #FS

        #version 330 core
        struct Light{
            vec3 direction;
        };
        uniform ivec2 ScreenSize;
        uniform Light light;
        uniform sampler2D PositionMap;
        uniform sampler2D ColorMap;
        uniform sampler2D NormalMap;
        out vec4 out_Color;

        vec2 CalcTexCoord(void){
            return gl_FragCoord.xy / ScreenSize;
        }

        vec4 CalcLight(vec3 position, vec3 normal){
            vec4 DiffuseColor = vec4(0.0);
            vec4 SpecularColor = vec4(0.0);
            vec3 light_Direction = -normalize(light.direction);
            float diffuse = max(0.0, dot(normal, light_Direction));
            if(diffuse > 0.0){
                DiffuseColor = diffuse * vec4(1.0);
                vec3 camera_Direction = normalize(-position);
                vec3 half_vector = normalize(camera_Direction + light_Direction);
                float specular = max(0.0, dot(normal, half_vector));
                float fspecular = pow(specular, 128.0);
                SpecularColor = fspecular * vec4(1.0);
            }
            return DiffuseColor + SpecularColor + vec4(0.1);
        }

        void main(void){
            vec2 TexCoord = CalcTexCoord();
            vec3 Position = texture(PositionMap, TexCoord).xyz;
            vec3 Color = texture(ColorMap, TexCoord).xyz;
            vec3 Normal = normalize(texture(NormalMap, TexCoord).xyz);
            out_Color = vec4(Color, 1.0) * CalcLight(Position, Normal);
        }

    Is it normal for the GPU to be used that much under the described circumstances? Is it due to poor performance of freeglut? I understand that the problem could be specific to my code, but I can't paste the whole code here; if you need more info, please tell me.
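
    As a rough editorial back-of-the-envelope check (not part of the original question): the behaviour is consistent with being fill-rate and bandwidth bound, since the pixel count alone grows almost 6x between the two window sizes mentioned, and the deferred first pass writes several G-buffer targets per pixel.

        # Rough fill-rate scaling between the two window sizes mentioned above.
        small = 600 * 600              # 360,000 pixels
        large = 1920 * 1080            # 2,073,600 pixels
        print(f"pixel ratio: {large / small:.1f}x")   # ~5.8x more pixels to shade

        # The deferred first pass writes three render targets per pixel
        # (position, diffuse, normal), so G-buffer bandwidth grows even faster.
        gbuffer_targets = 3
        print(f"G-buffer writes per frame at {large:,} px: {large * gbuffer_targets:,}")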

    Read the article

  • New host, high load?

    - by dotancohen
    A few minutes ago I signed up with a new web host. I have yet to move my sites over. Upon my initial SSH connection, I checked the load and memory usage; they do seem rather higher than I would like:

        # uptime
         12:06:51 up 71 days, 23:23,  1 user,  load average: 9.02, 9.49, 9.45
        # free
                     total       used       free     shared    buffers     cached
        Mem:      33014800   31927192    1087608          0    2384812   17729816
        -/+ buffers/cache:   11812564   21202236
        Swap:     16787916       8584   16779332

    Is that a bit too packed? I'm only paying about $5 USD per month, so I don't expect <0.1 loads, but ~10 is worrisome. Is it not? Also, there is no /etc/issue file, so I tried other methods to guess the OS:

        # uname -a
        Linux box358.bluehost.com 2.6.32-20120131.55.1.bh6.x86_64 #1 SMP Tue Jan 31 15:43:27 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
        # which yum
        /usr/bin/yum
        # which apt-get
        #

    That looks like possibly CentOS / RHEL 6.2?
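
    As a quick editorial aside (not from the question): a load average is only meaningful relative to the number of CPU cores on the host, which a short Python check can make explicit:

        import os

        # Load averages are run-queue lengths; divide by core count to judge saturation.
        load1, load5, load15 = os.getloadavg()
        cores = os.cpu_count() or 1
        print(f"1-min load {load1:.2f} on {cores} cores -> {load1 / cores:.2f} per core")
        # Roughly: around 1.0 per core means the CPUs are fully busy; a sustained
        # value well above that means processes are queueing for CPU (or stuck in I/O wait).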

    Read the article

  • Boost Targeted Traffic and Online Sales Leads by Spring Cleaning Your Website

    Spring is here, so this is a great time to do some important housekeeping on your website. This kind of house cleaning will help attract more visitors to your online home, allowing you to present your line of products and/or services to an ever-larger targeted audience. Here are several key areas to focus on in your site cleaning: keyword strategy, web page content, on-page elements (besides copy) and off-page elements. This article shares easy-to-understand tips on exactly what to do and how it will benefit your site!

    Read the article

  • Configure firewall (Shorewall/UFW) to allow traffic for services on an Ubuntu Server

    - by Niklas
    I have an Ubuntu Server 11.04 x64 that I want to secure. The server will be open to the Internet, and I want to be able to SSH/SFTP into the machine; the SSH server runs on a custom port. I also want a web server accessible from the Internet. These tasks seem not too hard to perform, but I also want SAMBA shares to be accessible from within the local network, and this seems to be a bit trickier. If possible, I also want to be able to "stealth" the ports necessary to protect the server further, while still allowing the SAMBA shares to be automatically found within the local network. I've never configured firewalls before, except on a router, and I always bump into a bunch of problems when doing it all by myself, so I was hoping for some tips or, preferably, a guide on how to do this. Thank you! Update: On second thought, I could just as well go with UFW if the same settings are achievable ("stealth" ports).

    Read the article

  • Website Optimization For Maximum Traffic

    Advertising of all kinds has grown into a major venue throughout the entire internet. Nearly all companies have placed ads on the internet. But with tons of web sites being viewed and s... [Author: Frank Breinling - Web Design and Development - June 08, 2010]

    Read the article

  • Very High CPU usage (100%) from just browsing the Web

    - by cole
    I tested on Firefox and Chromium. I'm at 100% while loading pages, which causes them to load slowly, and when I don't have an application running I'm at 40% CPU (at least). Everything is slow, basically. I'm also already on Ubuntu Classic, so I'm not using Unity. Should I go to 10.04? Is that more stable? On Windows this wasn't an issue. I have a dual boot with XP, a 2.4 GHz Intel Celeron with 768 MB RAM, and an Nvidia 6200 graphics card. I heard 10.04 was the most stable. Any suggestions?

    Read the article

  • Tracking traffic and/or referrals from iPad applications

    - by kayaker243
    In Google Analytics, there is extensive information on the mobile device, version, and browser version. However, this doesn't seem to go beyond the mobile browser. I would like to determine which application is responsible for visits to my site. Specifically, I want to know how many visits are coming from Zite. http://www.handsetdetection.com/properties/vendormodel/Apple/iPad/page:4 seems to indicate this information is probably available; where does Google Analytics expose this, if at all?
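
    One possible workaround, sketched here editorially rather than taken from the question: since Google Analytics reports only the browser, the originating app can sometimes be identified from the User-Agent string in the raw server access log. The log path and the "zite" substring below are assumptions that would need to be verified against real requests.

        import re
        from collections import Counter

        LOG_PATH = "access.log"   # hypothetical path; point at your server's combined-format log
        APP_TOKEN = "zite"        # assumed User-Agent substring; verify against real hits

        counts = Counter()
        with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
            for line in log:
                # In the combined log format the User-Agent is the last quoted field.
                quoted = re.findall(r'"([^"]*)"', line)
                if quoted and APP_TOKEN in quoted[-1].lower():
                    counts["in-app"] += 1
                else:
                    counts["other"] += 1

        print(counts)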

    Read the article

  • High resolution CLI?

    - by Mike Williamson
    I want the resolution of my console to match my screen resolution (1440x900). 1024x768 works fine, but for some reason, when I set 1440x900 and switch to ttyX, the command prompt is almost right off the bottom of the screen! The Ubuntu splash screen goes off the edge of the screen during boot as well. Here is my /etc/default/grub:

        GRUB_DEFAULT=0
        GRUB_HIDDEN_TIMEOUT=0
        GRUB_HIDDEN_TIMEOUT_QUIET=true
        GRUB_TIMEOUT=10
        GRUB_DISTRIBUTOR=`lsb_release -i -s 2> /dev/null || echo Debian`
        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
        GRUB_CMDLINE_LINUX=""
        GRUB_GFXMODE=1440x900
        GRUB_GFXPAYLOAD_LINUX=keep

    How do I get my CLI resolution to be 1440x900?

    Read the article

  • SEO As the Online Business Traffic Driver

    SEO means that an online business marketer can drive his site or blog to the top places on the search engine result pages for the selected keywords. This requires that the marketer knows the search engine algorithm, i.e. the criteria by which the engine ranks the sites in a certain order.

    Read the article

  • High I/O wait after login

    - by Jackson Tan
    I've noticed that ubuntuone-syncdaemon hogs the hard disk every time I log in to Ubuntu (10.04). This takes two or three minutes, which makes Ubuntu insufferably slow. Opening Firefox is okay, but the browser is constantly greyed out and lags horribly. Given that I often shut down my laptop when I'm not using it (about 3 to 4 times a day), this makes Ubuntu lose much of its lustre because of its long boot time. Is this normal behaviour for Ubuntu One? Is it intended? Note that I've actually posted this in the forums, but I received only a few replies.

    Read the article

  • Bug once in a while, but high priority

    - by Shirish11
    I am working on a CNC (computer numerical control) project that cuts shapes into metal with the help of a laser. Now my problem is that once in a while (1-2 times in 20-odd days) the cutting goes wrong, or not according to what was set. This causes losses, so the client is not very happy about it. I have tried to find the cause by including log files, debugging, and repeating the same environment, but it won't repeat. A pause-and-continue operation will again make it run smoothly, with the bug eventually reappearing. How do I tackle this issue? Should I report it as a hardware problem?

    Read the article

  • Learn How to Create High-Converting Landing Pages - Part 4

    Once you have placed calls to action through your pay-per-click ads on the landing pages, you have to make sure that, as part of your SEO plan, you include the paid keyword several times in your landing pages, which will lead to well-optimized pages for your search engine marketing efforts. As a smart SEO expert, it is important to include the keywords in a manner that cannot be skipped by your readers, because if they are missed, it will affect your PPC campaign management and search engine optimization results.

    Read the article

  • SEO title tag and earning a high rank on search engines [closed]

    - by Josh White
    Possible Duplicate: What are the best ways to increase your site's position in Google? One of the most basic SEO techniques is including an accurate description below 64 characters in the title tags of each page. I was wondering if it is considered ethical SEO to set up the contents based on a search keyword. So if the user searches for 'apples pictures', for example, then the title of the webpage would be 'apple pictures'. Note that the search keywords accurately describe my website contents, because the title will always relate to the body of the webpage, and 85-90% of the terms searched for will return corresponding results. Is this considered good SEO practice, and is it ethical? Also, can someone explain the idea behind "linking"? I read somewhere that it is good SEO practice to link to other websites and to have other websites link to you. Does this mean that I should include as many links to other websites as possible (that are somehow relevant to my website's goal)? Also, if I joined forums/services and posted my website URL in the signature, would that still be considered other websites linking to me?

    Read the article

  • High level overview of Visual Studio Extensibility APIs

    - by Daniel Cazzulino
    If your head is dizzy with the myriad VS services and APIs, from EnvDTE to Shell.Interop, this should clarify a couple of things. First, a bit of background: the EnvDTE API (DTE for short, since that's the entry-point service you request from the environment) was originally intended to be used by macros. It's also called the automation API. Most of the time this is a simplified API that is easier to work with, but it doesn't expose 100% of what VS is capable of doing. It's also kind of the "rookie" way of doing VS extensibility (VSX for short), since most hardcore VSX devs sooner or later realize they need to make the leap to the "serious" APIs. The "real" VSX APIs virtually always start with IVs, and they make heavy use of uint, ref/out parameters and HRESULTs. These are the APIs that have been evolving for years and years, and there is a lot of COM baggage. ...Read full article

    Read the article
