Search Results

Search found 11516 results on 461 pages for 'mobile views'.


  • Load is 0, yet site crawls (sometimes). What gives?

    - by Yegor
    I have a site with ~1.5-2 million page views per day running on 2 servers: one for MySQL, the other for everything else. The MySQL box has a load of 3; the frontend is usually at 0.0-0.1. Both are dual quad-core with 8 GB RAM running SAS drives in RAID 5. The CPU is idle for the majority of the time, and iowait is non-existent. I'm running nginx and memcache, and the site is built on PHP. Half the time everything runs perfectly, while at other times it lags something severe, when it takes 10-15 seconds for a page to load. Page execution time is always super low, but it seems to hang, waiting for something before it actually loads the page. What's even more weird is that it only happens to 1 file on the site (but it's the one that's most commonly accessed, the one that actually loads the content on the site). Other pages are super fast at all times, even when it takes 15 seconds to load the actual content. I have the nginx_stats plugin installed, and if I monitor it, the lag spikes happen when the write column goes above 100, and it frequently does... all the way to 500-1000. It does so at totally random times... not when traffic is heavy... it can do this in the middle of the night, and work perfectly at 5pm when traffic is at its highest. Any ideas?
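
    For reference, a minimal sketch of exposing the raw counters that nginx_stats reads, using nginx's stock stub_status module (the port and allowed address here are placeholders):

        # Inside the http {} block of nginx.conf:
        server {
            listen 127.0.0.1:8080;
            location /nginx_status {
                stub_status on;   # reports active, reading, writing, waiting
                access_log  off;
                allow 127.0.0.1;  # keep the counters private
                deny  all;
            }
        }

    Watching the writing counter next to iostat/vmstat output at the moment of a spike can show whether nginx is stuck waiting on the PHP backend rather than on CPU or disk, which would fit the symptoms described.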

    Read the article

  • Why does the screen resolution of 1440x900 suddenly disappear from the Intel GMA Control Panel?

    - by GeneQ
    I'm using a Vostro 1200 laptop with the Mobile Intel(R) 965 Express Chipset powering its graphics, running Vista 32-bit SP2. I've been using the Vostro with a Dell SE198WFP LCD Monitor as the external display since day one, for about two years, without any problems. Recently, I plugged the Vostro into a couple of other monitors. The problem is, now my main monitor's (the SE198WFP) native resolution of 1440x900 @ 60 Hz is no longer available. (See below.) I've tried everything from uninstalling and reinstalling the Intel drivers as well as the monitor drivers, to no avail. I've Googled this problem and it appears that it has happened to other people, but all the answers involve people giving up in frustration or reinstalling; both terrible outcomes. Has anybody ever figured out why this happens, and is there a good solution? Thanks. UPDATE: This dude has a complicated solution, which I haven't tried yet. His explanation of the problem was: After an exausting (sic) search for an answer to the matter of why my brand new 19″ widescreen monitor's native resolution (1440×900) was unavailible (sic) in the display properties, I finally stumbled upon an article a person posted on Intel's forums that basically explained what shannanigans (sic) Intel had been up to with their GMA 950 line of onboard graphic solutions. Not very comforting.

    Read the article

  • Servers / RAM for social network - how many?

    - by Marty
    I am launching my social network soon and looking into hosting. The question I am stuck on is: do I need separate servers for web vs. database vs. image handling, since there is photo sharing? Or does 1 server handle it all? Also, is more RAM better? If I get 50 GB of RAM, is that better than having 8 GB? EDIT: It is PHP (CodeIgniter) and MySQL for now (I may switch to a NoSQL DB later if demand calls for it). I will be using memcache also. Concept-wise it is similar to Yelp, so geographically based with lots of user content and image sharing, plus live feeds and privacy levels. The user plan is an open question; without testing the demand for this I can't give a number. But the concept is unique - no one out there has the set of features I am releasing - so it could grow. Ideally I want to plan for handling about 1-2 million views/month from launch. If it grows beyond that, then I will upgrade.

    Read the article

  • Java: very slow Tomcat and too-big WAR file

    - by NaN
    I created some sort of RESTful API backend for a mobile app. It's written completely in Java using Jersey as the framework. At the moment no database is used; it's all in memory, but this is no problem so far (it's only for prototyping purposes). I ordered the smallest package from Digital Ocean and installed Tomcat 7. All in all Tomcat works, but I have three major problems: 1) It takes a long time until Tomcat deploys the app: I deploy it via the Tomcat manager and it takes about 2 minutes until the site works (excl. WAR upload time). 2) The WAR files are quite big (16 MB): I don't know why they are so big. There are no database dependencies and most logic is written in plain Java. Okay, we are using Jersey, but 16 MB is a lot for the logic of a small webservice. 3) I have to restart Tomcat every 3 days or so. It looks like a memory leak or something similar. If the app runs for a few days the response time is quite high and the server seems to be frozen. It works again if I restart Tomcat via SSH. You can find my Maven pom file right here. Do you have some tips? Are there good Tomcat alternatives?
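
    As a hedged illustration of one common cause of oversized WARs - bundling jars the container already provides - a pom.xml fragment (the artifact here is an example, not taken from the poster's actual pom):

        <!-- Mark container-provided APIs as "provided" so Maven compiles
             against them but leaves them out of the packaged WAR. -->
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <version>3.0.1</version>
            <scope>provided</scope>
        </dependency>

    Running mvn dependency:tree shows exactly which transitive dependencies are being pulled in and bundled.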

    Read the article

  • Server configuration for our website [duplicate]

    - by Varun Varunesh
    This question already has an answer here: Can you help me with my capacity planning? 2 answers
    We are a start-up, and 6 months back we launched our beta version website. Now we are in the phase of building our website and web services for the final product. This website will be based on PHP, Python, a MySQL database, and a WAMP server. Right now in the beta version we are using an Azure VM for hosting, with a configuration of 786 MB RAM and a shared CPU. We have an average of 200 users coming to our website daily. Now we are trying to increase the number of daily users from 200 to 1500, and I am thinking our server should be able to handle at least 100 concurrent users. Also, we have developed web services for our mobile apps, which can also increase the load on the server. So here are the questions that bring me here: I am pretty confused about whether to go with shared hosting or VM-based hosting. If VM, then what configuration will be best for our requirements (as discussed above)? Currently our VM is a Windows-based server and it's very simple to manage, so other than the cost factor, why should I go for a Linux-based server? What other factors should I keep in mind while choosing the server for our requirements?

    Read the article

  • Alternative Bluetooth Stacks for Windows 7 64-bit

    - by Martin
    I have a notebook with a built-in Broadcom BCM2046 Bluetooth adapter and several Bluetooth HID devices (mice, keyboards, etc.). The operating system is Windows 7 64-bit Professional. The HID devices all work perfectly with other computers, but on the system mentioned above, problems occur with some power-saving features inside the HID devices (see e.g. the Amazon reviews for the Microsoft Mobile Keyboard 6000 not waking up). I have tried the Bluetooth drivers supplied by Windows Update and the latest Broadcom drivers directly from the Broadcom updater software. The problems persist (I can rule out any further configuration issues or alternative device drivers; I have tried every possibility). I have tried a trial version of the BlueSoleil Bluetooth stack and it solved the wake-up problem. However, the BlueSoleil stack causes some other problems, is relatively expensive, and I would prefer not to use it. My question: are there any other alternative Bluetooth stacks available for Windows 7 64-bit? To my knowledge there used to be a Toshiba Bluetooth stack for non-Toshiba hardware, but the older versions I have found on the internet do not install; they do not seem to recognize the Bluetooth hardware when installing the driver.

    Read the article

  • USB hard disk not working on dual-boot Windows 7/8

    - by Jesper
    Yesterday I installed Windows 8 on a machine that already had Windows 7. They are set up for dual boot and both systems work fine. The problem is that inserting a USB hard disk in either system does nothing. If I connect a USB mouse or mobile phone, they work fine, so the USB ports are active/working, and the USB hard drives that I am trying to connect work on my other laptop just fine. I have tried uninstalling all USB-related items in Device Manager and letting them reinstall upon restart, but that didn't help. The USB drive does not show up in Disk Management either. The strange thing is that it is exactly the same situation on both Windows installations: USB mice etc. work just fine and USB hard drives do not. Any ideas on solving this problem would be great. I don't know if it is important, but this is a Toshiba Tecra R950 laptop. EDIT: I have found out that my other USB HD (a Western Digital) works on this laptop, but my StoreJet Transcend and an Adata "something" do not work. All three work on another Windows 7 laptop. Size-wise, the WD is in the middle at 400 GB; the StoreJet is 640 GB and the Adata is 200 GB.

    Read the article

  • Ubuntu network card problem.

    - by Steve Greene
    Hello folks, several days ago I installed Ubuntu 9.10 onto my Acer Aspire 3100 laptop, running it alongside Windows Vista as a dual-boot system. Creation of the Ubuntu boot CD went fine, and the installation onto my hard drive was flawless. Ubuntu opens and behaves as I would expect, except for one little problem. For reasons unknown to me, Ubuntu is not communicating with my laptop's networking hardware, and I have no internet connectivity; it all works fine under Windows Vista. In the upper right of the Ubuntu desktop, I click on the network icon and it does not show a wireless connection at all. At home, where I use a dialup modem, I also see no means of getting online. My modem is an HDAUDIO Soft Data Fax Modem with SmartCP, manufactured by CXT (Conexant Systems Inc.; file version 4.0.13.0, driver version 7.58.0.0). I am an advanced computer user, but I am not a programmer. I seek a solution that is user-friendly for normal people, something equivalent to a driver that I can easily install or activate that will allow Ubuntu to see my hardware and get me connected. Can anyone help me over this hopefully-little glitch? My processor is a Mobile AMD Sempron Processor 3500+ at 1.80 GHz, with 1.50 GB RAM, and a 32-bit operating system.

    Read the article

  • Custom Domain for Google App Engine and Google Apps

    - by Kevin
    I have set up and configured Google App Engine and Google Apps to use my custom domain with a CNAME 'www'. I have configured my DNS (via fasthosts.co.uk) with the CNAME and pointed it at ghs.google.com. I can access the website using the Google App Engine domain at capel-y-crwys.appspot.com, but I can't access it via my custom domain www.capelycrwys.org.uk. I have allowed several days for propagation of the DNS etc. The really strange thing is I can access the app via my custom domain when I use the web browser on my Android mobile phone. I can't access the app via my custom domain from my home internet connection, my work internet connection, or a friend's internet connection. I tried a few online web proxies and I can access the app via the custom domain. I posted this question on the Google forums code.google.com/appengine/forum/?place=topic%2Fgoogle-appengine%2FfUP-G_0FKE4%2Fdiscussion and a commenter said he could access the app via the custom domain. So why can't I access it directly via my home internet connection etc.? I've tried loads of Google searching and even found a similar-sounding post here on Server Fault serverfault.com/questions/208461/custom-domain-name-server-not-found-google-app-engine-and-google-apps but it doesn't have an answer that helps me.
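
    A quick way to compare what different resolvers return for the CNAME, as a sketch using the standard dig tool (8.8.8.8 is Google's public resolver):

        # Ask the local resolver, then a public one:
        dig www.capelycrwys.org.uk CNAME
        dig @8.8.8.8 www.capelycrwys.org.uk CNAME
        # A working setup should answer with: CNAME ghs.google.com.

    If the home ISP's resolver returns nothing (or something stale) while 8.8.8.8 returns the CNAME, that matches the symptom of proxies and the phone network working while the listed connections fail.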

    Read the article

  • How can I change the binding order of network adapters in Windows 7?

    - by Chris Farmer
    The end goal here is that I am trying to install an Oracle 10g server on my Windows 7 x64 dev box. I use DHCP, and the Oracle installer is throwing up this warning:
        Checking Network Configuration requirements ...
        Check complete. The overall result of this check is: Failed <<<<
        Problem: The install has detected that the primary IP address of the system is DHCP-assigned.
        Recommendation: Oracle supports installations on systems with DHCP-assigned IP addresses; however, before you can do this, you must configure the Microsoft LoopBack Adapter to be the primary network adapter on the system. See the Installation Guide for more details on installing the software on systems configured with DHCP.
    I have installed the loopback adapter, but I am not sure how to make it the primary network adapter. I see this Microsoft KB article on the subject, but it's Windows XP-oriented, and I can't seem to find a comparable one for Windows 7. Some of the options it talks about don't seem to be present in the views of the adapters that I see. So, how can I make the loopback adapter become the primary adapter?
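
    One possible approach, sketched from memory and worth verifying locally (the adapter name "Loopback" is a placeholder for whatever the Microsoft Loopback Adapter is called in your Network Connections):

        :: Give the loopback adapter the lowest interface metric so it sorts
        :: first; confirm the parameter set with "netsh interface ipv4 set interface /?".
        netsh interface ipv4 set interface "Loopback" metric=1

    The GUI equivalent in Windows 7 is Network Connections, press Alt to reveal the menu bar, then Advanced, Advanced Settings, Adapters and Bindings, where the adapter order can be changed directly.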

    Read the article

  • Painless deployment of a Django app (port from Drupal). Do I have to switch to a VPS?

    - by Monden
    I'm about to complete porting my Drupal-based community site to Django. My Drupal site has been hosted on shared hosting (DreamHost) for the last 4 years, and stability and performance have been satisfactory. The site gets around 5k unique visitors with 70-80k page views a day. This will be my first deployment of a Django application and I'm not comfortable with managing my own VPS. I use Ubuntu as a dev server, but I don't have experience with it in a production environment. I have an unrelated internal CRM app (Django) that I host with WebFaction; however, security and performance aren't an issue there as it's only accessed by 5 people. Unfortunately, I don't have much time to learn and maintain a VPS at this moment. I would like to know if I can host a site with this much traffic in WebFaction's shared environment? How would performance differ in comparison to Linode or Slicehost? Google App Engine isn't an option at the moment as I'll be using my current PostgreSQL database.

    Read the article

  • Offline productivity

    - by Frank Meulenaar
    On some days I'm commuting 2 hours (one way) on the train. I don't have any mobile internet, nor is there always WiFi service on the train. For security reasons I can't do any job work on the train, so I'm trying to use the time for my own geek projects. I'm looking for general solutions on how to do this (I'm on Firefox/Windows, but I don't think it matters). Email works perfectly with Gmail offline: it syncs directly when online and remembers complicated stuff. So far I have used the ScrapBook plugin to store a website. It works well, but I have to download my favorite news page again every day - I want it to sync as soon as possible. It would be even more awesome if I could click a page on my desktop and my laptop would sync as soon as it has the chance (edit: maybe the AutoSave plugin for ScrapBook can do this). Similarly, I use the DownloadHelper plugin to download YouTube videos, but I'd like something that automatically downloads videos from a given channel. Any tips are welcome. So far my early morning schedule is: wake up, power on laptop, make coffee, power off laptop, and leave within 10 minutes (enough time for Gmail to sync), but I can imagine a system where my laptop stays on during the night (or boots before I wake (and makes me coffee :])).

    Read the article

  • What is the correct authentication mechanism when there are users inside and outside the domain?

    - by Gary Barrett
    We have a Windows 7 enterprise desktop data entry app for mobile (laptop) users, with a local SQL Server 2008 R2 Express db that syncs data with a SQL Server 2008 R2 db on the server. Authentication is required before syncing the data. The existing group of users is part of the organisation's domain, so that is the normal scenario and they connect to the SQL Server directly. But there are plans for a second group of app users who belong to various partner organisations, so they are outside our domain and have their own various separate domains/accounts. The aim is to deploy the desktop app to them so that they can periodically sync data to our SQL Server. What I am uncertain of: Is it possible to authenticate users from another domain? Can permissions be managed via Active Directory etc.? Which authentication protocol should be used in this scenario: Windows, Forms, SQL, etc.? The IT people are requesting that users of the system be managed via Active Directory. Is it possible to manage the external domain users' access via Active Directory?

    Read the article

  • How do I keep a table in sync across 4 DBs to be used in SQL replication filtering?

    - by Refracted Paladin
    I have a WinForms data entry application that uses four separate databases. This is an occasionally connected app that uses merge replication (SQL 2005) to stay in sync. This is working just fine. The next hurdle I am trying to tackle is adding filters to my publications. Right now we are replicating 70 MB, compressed, to each of our 150 subscribers when, truthfully, they only need a tiny fraction of that. Using filters I am able to accomplish this (see code below), but I had to make a mapping table in order to do so. This mapping table consists of 3 columns: a PrimaryID (guid), WorkerName (varchar), and ClientID (int). The problem is I need this table present in all FOUR databases in order to use it for the filter since, to my knowledge, views or cross-db queries are not allowed in a filter statement. What are my options? Seems like I would set it up to be maintained in 1 database and then use triggers to keep it updated in the other 3 databases. In order to be a part of the filter I have to include that table in the replication set, so how do I flag it appropriately? Is there a better way altogether?
        SELECT <published_columns> FROM [dbo].[tblPlan] WHERE [ClientID] IN
            (select ClientID from [dbo].[tblWorkerOwnership] where WorkerID = SUSER_SNAME())
    Replication also allows you to chain filters together; this next one sits below the first one, so it only pulls from the first's filtered set:
        SELECT <published_columns> FROM [dbo].[tblPlan] INNER JOIN [dbo].[tblHealthAssessmentReview] ON [tblPlan].[PlanID] = [tblHealthAssessmentReview].[PlanID]
    P.S. - I know how illogical the DB structure sounds. I didn't make it. I inherited it and was then told to make it a "disconnected app."
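
    Since the entry above already leans toward triggers, here is a hedged sketch of that idea in T-SQL (the database names SyncDB2/3/4 and the trigger name are placeholders; it assumes the mapping table stays small enough that rebuilding the mirrors on each change is cheap):

        -- Maintain tblWorkerOwnership in one master database and mirror it
        -- to the other three whenever it changes.
        CREATE TRIGGER trg_SyncWorkerOwnership
        ON [dbo].[tblWorkerOwnership]
        AFTER INSERT, UPDATE, DELETE
        AS
        BEGIN
            SET NOCOUNT ON;
            DELETE FROM [SyncDB2].[dbo].[tblWorkerOwnership];
            INSERT INTO [SyncDB2].[dbo].[tblWorkerOwnership] (PrimaryID, WorkerName, ClientID)
                SELECT PrimaryID, WorkerName, ClientID
                FROM [dbo].[tblWorkerOwnership];
            -- ...repeat the DELETE/INSERT pair for SyncDB3 and SyncDB4.
        END

    To keep the replication agents from fighting over it, the mirrored copies would be published as download-only articles (or excluded from the article list entirely) so that only the master copy ever originates changes.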

    Read the article

  • SFTP, SCP, Secure WebDAV: which is the most suitable?

    - by Xavier Maillard
    Hi, currently I am hosting a WebDAV share set up in order to store files I need anywhere I am. It is available via HTTPS. The thing is that I do not need all the HTTP machinery - i.e. my nginx HTTP server is only there for this WebDAV folder. I am not sure I made the best choice. My requirements on the client side are: secured transfers; mountable as a network drive at work with 'near realtime sync'; usable from any OS I might use (including my mobile (Android)). At first, I chose WebDAV since it would pass through my work proxy (which refuses anything that is not HTTP/S (port 80 or 443)). Today, I am not satisfied with the setup, and even if nginx's memory footprint is pretty small, its WebDAV support is not really "clean" and full. What would you recommend between SFTP, SCP and the current WebDAV solution? I think SFTP is the closest solution, but I still have to find out how to pass through my proxy ;) SCP seems quite limited from what I read about it (only file transfers, if I read right). Cheers
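
    If SFTP ends up being the choice, one hedged sketch for the proxy problem (assumes OpenSSH on the client, an OpenBSD-style netcat, and that the server's sshd can listen on port 443 so the HTTP CONNECT proxy lets it through; all host names are placeholders):

        # ~/.ssh/config
        Host fileserver
            HostName files.example.org
            Port 443
            ProxyCommand nc -X connect -x proxy.corp.example:3128 %h %p

    After that, sftp fileserver tunnels through the proxy, and something like sshfs can cover the "mountable as a network drive" requirement on the OSes that support it.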

    Read the article

  • A router that supports connecting to 2 different WiFi networks

    - by Allan Deamon
    I have the following setup in one place: we have a small local ISP reached through wireless. I have an external parabolic antenna, connected to an external USB WiFi radio, connected through USB to an old desktop PC. The PC connects to the ISP WiFi network, then makes a dial-up (PPPoE) connection over this WiFi link. This will expand, with other mobile devices to be used. When I need to, I take my home wireless router and connect it through Ethernet to the PC, which then shares the internet connection. The problem is that the PC must always be ON and working. I would like to buy a wireless router which could be an AP for the mobile devices, notebooks, etc., and could also connect to the ISP WiFi/PPPoE network. So, this device must: have one radio with a detachable antenna to connect to the external antenna - it must connect as a client to that network and then dial up the PPPoE; have another radio serving as an AP (infrastructure mode) for the local place; and it can't be very expensive. I found a candidate: ( http://www.tp-link.com/en/products/details/?categoryid=1682&model=TL-WR2543ND ) It has 3 detachable antennas, working with dual band. Officially, its firmware doesn't support this. My supposition: if internally there are 2 or 3 distinct WLAN interfaces (like wlan0, wlan1), and there is support, I could use OpenWrt, DD-WRT or Tomato to make this work. It also has 1 USB port, which I could use to connect my current USB WiFi card instead of the old PC. Another alternative is a router that can do this out of the box, with the original firmware. But I don't think that is an easy thing to find.

    Read the article

  • Advice on resizing 1280×720 video for web audiences.

    - by jamiethompson90
    Forgive my spelling, I'm posting this from my mobile. I've recently decided to record videos to help teach a visual language. My camera likes to boast that it can record at 1280; it's a cheap camera, about £75, so the quality isn't amazing. But it's okay. Anyway, it has some other settings for lower resolutions, but I figure I might as well record at a larger size in case the need arises for a bigger source file in the future. I've been looking at JW Player to play the converted files (MP4 to FLV, I think). What do you think a good size would be to convert to? I want it to look nice and clear, remembering it is a visual language, so lip patterns, facial expressions, body movement, fingers etc. are all important; sound is not that important, but I would like to have the choice to toggle captions. Thanks for any help, any advice appreciated; first time I have done a video project! P.S. If anyone's interested, it's BSL. Jamie
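
    As one hedged illustration of the usual downscale step (option names as in reasonably recent ffmpeg builds; the file names are placeholders):

        # 1280x720 -> 854x480 keeps the 16:9 shape; -crf trades file size for
        # quality, and a lower value keeps lip patterns and fingers sharper.
        ffmpeg -i lesson-720p.mp4 -vf scale=854:480 -c:v libx264 -crf 20 -c:a aac lesson-480p.mp4

    Encoding to H.264/MP4 rather than FLV also keeps the door open for HTML5 playback, with JW Player handling the Flash fallback.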

    Read the article

  • Linux will not activate wireless after device has been re-enabled

    - by XHR
    I'm using an Eee 900A netbook by Asus. By pressing Fn + F2, I can disable or enable the wireless chip on the netbook; a blue LED indicates the status. I've been able to connect to wireless networks just fine with this netbook. However, if the wireless chip ever becomes disabled, I have to reboot to get my network connection back. This generally happens when suspending. For some reason the LED will be off and I have to hit Fn + F2 for it to light up again. However, after doing so, Linux will not reconnect to the network. It simply changes the wireless status from "wireless is disabled" to "device not ready". Even worse, I've recently had issues with the chip not being enabled at boot, thus making it nearly impossible to get connected. I've searched around on-line but haven't found much of anything useful on this. This happens on all kinds of different distros including Ubuntu 9.10 Netbook, EeeBuntu 4 beta, Jolicloud and Ubuntu 10.04 Netbook. Edit: I noticed this question is getting a lot of views. To give a quick update, I never did resolve this issue with the given distros. However, I'm currently running Ubuntu 10.10 netbook edition and this issue has gone away.
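
    When it happens again, the kernel's rfkill state is worth checking before rebooting; a quick sketch (rfkill ships with the distros named above, though the final command varies by version):

        # Show whether the radio is soft- or hard-blocked:
        rfkill list
        # If the wlan entry says "Soft blocked: yes", clear it:
        sudo rfkill unblock all
        # Then restart the network stack instead of the whole machine:
        sudo service network-manager restart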

    Read the article

  • Outdoor WiFi Mesh Topology vs. Repeaters

    - by IronJaxor
    Here's the current configuration in our organization (which I believe is incorrect): we have a number of Cisco 1500 series APs (22 in total) that are mounted outdoors to provide seamless WiFi coverage over a large area. Each AP, however, has its own physical Ethernet connection back to the WLC (all the APs are marked as Root APs). They are all broadcasting the same SSID. We have tried to stagger the channel selection, but because there are only three non-overlapping channels to choose from, and in some areas the density of APs is quite high, there are multiple places with channel interference. With this configuration we experience 100-150 client disconnects every day. (Our clients are mobile, so they move throughout the coverage area constantly.) My idea is to switch the APs to the same channel, thereby forming a wireless mesh; use the built-in functionality of the 1500 series to use 802.11a as the backhaul; and designate one or two APs as Root APs and wire them back to the WLC. That would form a WiFi mesh, which, if I'm not mistaken, is the point of the 1500 series in the first place! I am however completely new to WiFi networks and wondering if I am simply mistaken in what I believe my proposed changes will enable, or if there is a better way to tackle the WiFi topology.

    Read the article

  • Apache suddenly very slow on HTTP and faster on HTTPS

    - by hsnm
    Background: I have Apache 2 running on Ubuntu. Usage is low, and it is mostly accessed via a web service URL from mobile apps. It was working fine until I installed SSL certificates; I now have both HTTP and HTTPS. When I access the server using HTTPS, I get a fairly quick response (but probably not as fast as before). When I use HTTP, it's very slow. What I tried: following this post, I curl localhost from the host itself and it also takes a long time, meaning there is no routing issue. The server runs on an Amazon EC2 instance and is managed by me only. Also: I see that Apache, once running, creates the maximum number of processes it is allowed to, which was not the case before. I lowered MaxClients to 20 and I think I'm getting faster responses, but it still takes over a minute and I always have MaxClients Apache processes running. dmesg returns many lines like [ 1953.655703] TCP: Possible SYN flooding on port 80. Sending cookies. When I run netstat I get many entries in the SYN_RECV state. Possibly a DDoS attack? From EC2's monitoring graphs I see a pattern of high "Maximum Network In (Bytes)" starting 2 days ago. By the way, the server is still being tested; the actual traffic is very low and not consistent. I tried to go with this solution to limit incoming connections using iptables - still no luck, but I'm trying. Question: What could be the problem? Is this a DDoS attack?
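
    The iptables idea from that solution, as a hedged sketch (the rate numbers are illustrative, not tuned):

        # Let established traffic through untouched, cap the rate of new
        # connections on :80, and drop the excess SYNs.
        iptables -A INPUT -p tcp --dport 80 -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A INPUT -p tcp --dport 80 --syn -m limit --limit 50/second --limit-burst 100 -j ACCEPT
        iptables -A INPUT -p tcp --dport 80 --syn -j DROP
        # SYN cookies are already kicking in per the dmesg line, but to pin it:
        sysctl -w net.ipv4.tcp_syncookies=1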

    Read the article

  • How to access XAMPP virtual hosts from iPad on local network?

    - by martin's
    I'm using XAMPP on one machine, with multiple virtual hosts defined, one per project. The format is [project].local. For example: apple.local, microsoft.local, client-site.local, our-own-internal-site.local. All works perfectly from that one machine. I now want other systems within the network to access the various sites. The main reason for wanting to do this is to be able to test site functionality and layout from mobile devices without having to upload partial work to public servers. I can access the main XAMPP default site by simply entering the IP address of the XAMPP machine in, say, Safari on an iPad. However, there is no way to reach the .local hosts that I can see. Would this entail setting up a DNS server within the network? We have a mixture of Windows and Mac machines, no Linux. The XAMPP machine is Vista 64. I don't want real external internet access to be affected in any way; I just want ".local" pointed at the XAMPP machine, if that makes sense.
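
    For reference, a minimal sketch of the vhost half of this (paths and names are placeholders; XAMPP of that era ships Apache 2.2, hence NameVirtualHost):

        # httpd-vhosts.conf - one name-based vhost per project:
        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName client-site.local
            DocumentRoot "C:/xampp/htdocs/client-site"
        </VirtualHost>

    A name-based vhost only answers requests whose Host header matches, which is why the bare IP reaches only the default site. And since the iPad has no editable hosts file, the ".local" names do indeed have to come from DNS: a small DNS server on the network (even on the XAMPP machine itself) returning the XAMPP machine's IP for each project name.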

    Read the article

  • 'Future-proof' Live Audio Capture & Broadcast [migrated]

    - by maxpowers
    I'm looking to implement some live audio broadcasting functionality within a Ruby on Rails site for a client and was hoping I could get some input from people who have tackled this type of thing before. Essentially what I need to do is capture and record a user's audio (via microphone, line in, etc.), then stream that to 1,000+ listeners with very little latency - sub-2-second if possible. So it looks like we've got 3 parts: web-based audio capture (likely with Flash or JS); a server to accept the audio feed and stream it to listeners (likely Icecast or Wowza); and the actual audio player (maybe HTML5 with Flash as a fallback? Maybe this jPlayer fork). Does RTMP make sense here? Or maybe HTTP? What's the most 'future-proof' way to make this happen? I'm building with mobile in mind, but I still want to be able to stream to anyone. I've found lots of potentially helpful threads and software, but I'm struggling to get an idea of how it all fits together. I'm a front-end guy and way out of my comfort zone here, so if anyone has insights to offer, I'd love to hear them.
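
    For the player piece, a minimal sketch of the HTML5-first pattern mentioned above (the stream URL is a placeholder for an Icecast-style mount point):

        <!-- Browsers that support HTML5 audio use the element directly;
             whatever sits inside the tag, e.g. a Flash player like jPlayer's,
             is the fallback for those that don't. -->
        <audio controls src="http://streams.example.com:8000/live.mp3">
            A Flash-based player would be embedded here as the fallback.
        </audio>

    One latency caveat: plain Icecast/HTTP playback typically buffers several seconds on the client, so the sub-2-second target is the main argument for RTMP (Wowza) on the Flash path.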

    Read the article

  • Magical moving desktop icons

    - by Nathan Taylor
    I have encountered a very strange behavior in Windows 7 that I cannot seem to identify, and I have never seen or heard of it on any system configuration. Whenever I move my mouse to the left-most edge of my primary display (centered in a 3-display setup), my desktop icons magically move away from the cursor (up or down and to the right). It only happens when my desktop has focus and the mouse is positioned on the left, top or bottom edge of the main display. Moving the mouse all the way to the right edge of my right secondary display causes the desktop icons to snap back into their correct positions. Ridiculous video of the issue My setup is 3 displays on two display adapters. The main display is running at 2560x1600, connected to the machine via a USB-powered DVI-D to DisplayPort adapter, and is driven by an NVIDIA NVS 3100M video card. The secondary displays are running at 1440x900 and 1200x1920 and are driven by integrated Intel HD Graphics (mobile). It seems like some kind of panning behavior, but it's obviously not working as expected. I have updated all of my drivers, but no change. It's probably worth noting that the desktop icons are set to auto-arrange.

    Read the article

  • JavaScript loading never completes on many sites

    - by Joe
    I recently moved country and have found that on many websites the page never finishes loading. In some cases, no content is ever displayed, but the loading will never time out. Loading Developer Tools in Chrome shows me that it is the JavaScript files which never load. For example, this BBC article will never load compatability.js, though it will load all the other JS files perfectly. Google Maps often fails to finish loading, meaning it's impossible to search. There seems to be no pattern to which files will fail to load (i.e. they don't all come from the same CDN). I have tried Chrome, Safari and Firefox on OS X 10.8, and Chrome on my girlfriend's OS X 10.7. I have similar issues on the iPad. In many cases, if I go to the mobile version of the page, that seems to load fine. I have run the browsers in private mode, disabled plugins, updated Flash, cleared the cache, and flushed the DNS cache - though it would seem that if this is happening on other devices, none of this would work anyway. Is this an ISP issue? And if so, why would it be limited to certain JS files and not all? JS files from the same domain work fine, so I'm not really sure what I should be looking for.
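
    One hedged way to take the browser out of the loop (the URL here is illustrative, not the BBC's actual asset path):

        # Time the exact asset that stalls; the write-out variables separate
        # DNS, connect and total time so you can see where it hangs.
        curl -o /dev/null -s -w "dns:%{time_namelookup} connect:%{time_connect} total:%{time_total}\n" http://static.example.com/js/compatability.js

    If the total time hangs while neighbouring files on the same host return instantly, comparing the result over a VPN or a different DNS resolver would point at (or away from) the ISP.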

    Read the article

  • gitweb refusing to blame

    - by Slipp D. Thompson
    I'm attempting to get gitweb (git 1.8.4.2, via git instaweb) in a project dir on my Debian server to offer blame views. In my /etc/gitweb.conf:
        …
        # default logo, favicon, etc. settings
        $feature{'blame'}{'default'} = [1];
        $feature{'pickaxe'}{'default'} = [1];
        $feature{'snapshot'}{'default'} = ['tgz', 'txz', 'zip'];
        $feature{'highlight'}{'default'} = [1];
        $feature{'pathinfo'}{'default'} = [1];
    In my global config file:
        [gitweb]
            blame = true
            snapshot = tgz, txz, zip
            patches = 256
            avatar = gravatar
        [instaweb]
            local = false
            httpd = apache2 -f
            port = 4321
    In my project's .git/config file:
        [gitweb]
            blame = true
    And yet, when I try to load a git blame view (via hand-modifying the URL to http://myserversip:4321/?p=.git;a=blame;f=Tests/InchCoordProxyTests.m;h=b4b2…;hb=53b4, since blame action links don't show up), I get the "Blame view not allowed" error. Doing a quick search for "Blame view not allowed" in the gitweb.cgi source reveals plainly that the gitweb_check_feature('blame') conditional is failing. What am I doing wrong? Or, is there a way to verbosely print out why gitweb is doing what it's doing (e.g. which config files were read, which settings were loaded from each file, etc.)?
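
    One hedged thing to check, since git instaweb generates its own configuration: gitweb may not be reading /etc/gitweb.conf at all (it honours the GITWEB_CONFIG environment variable), and per-repository [gitweb] settings only take effect when the feature is marked overridable. A sketch for whichever config file gitweb actually loads:

        # Allow a repo's own [gitweb] section to switch the feature on:
        $feature{'blame'}{'override'} = 1;
        $feature{'pickaxe'}{'override'} = 1;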

    Read the article
