Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.

Page 111/390 | < Previous Page | 107 108 109 110 111 112 113 114 115 116 117 118  | Next Page >

  • Weird unexpected image compression on a web server running Apache on Ubuntu?

    - by Billy Bob Thornton
    I have a weird problem on my production web server running Apache on Ubuntu: it compresses my images, thereby dramatically lowering their quality! Actually I have two virtual hosts running, each located in a different folder. Whether I display .gif images by navigating the two sites or access them directly by their URL, their size and quality are invariably degraded. I tried with three different browsers: same problem. Viewing images on other sites on the web: no problem. Of course I disabled mod_deflate on the server (which should not compress images anyway), but the phenomenon remains. On my local development server, running the same configuration, everything is OK. Now I'm completely lost! For the record, my configuration: Ubuntu 10.04, Apache 2, PHP 5.
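
    A quick way to confirm whether Apache (or one of its modules) is really altering the files is to compare the served bytes against the file on disk. A minimal diagnostic sketch in Python; the URL and path are hypothetical placeholders:

        import hashlib
        import urllib.request

        # Hypothetical values: replace with a real image URL and its path on the server.
        URL = "http://example.com/images/logo.gif"
        LOCAL_PATH = "/var/www/site/images/logo.gif"

        def sha256(data):
            return hashlib.sha256(data).hexdigest()

        served = urllib.request.urlopen(URL).read()
        with open(LOCAL_PATH, "rb") as f:
            on_disk = f.read()

        print("served:  %6d bytes  %s" % (len(served), sha256(served)))
        print("on disk: %6d bytes  %s" % (len(on_disk), sha256(on_disk)))
        print("identical" if served == on_disk else "the server is rewriting the image")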

    Read the article

  • How can I host a website on a dynamically-assigned IP address?

    - by nick
    I recently upgraded my internet connection to the point that it is much faster and more reliable than my current web host. I would like to move my current domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see doing this:
    1. Convince my ISP to give me a static IP.
    2. Assign my router my current IP to force a static IP (which might work?).
    3. Set my DNS record to my current IP address and update it on the rare occasions that it changes.
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Which of these options will work most reliably?
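
    Option 3 is essentially what dynamic-DNS services automate. A minimal sketch of the polling half in Python, using the public api.ipify.org service; the actual record update depends on the DNS provider's API, so it is left as a hypothetical hook:

        import time
        import urllib.request

        CHECK_INTERVAL = 300  # seconds between polls

        def current_public_ip():
            # api.ipify.org returns the caller's public IP as plain text.
            return urllib.request.urlopen("https://api.ipify.org").read().decode().strip()

        def update_dns_record(ip):
            # Hypothetical hook: call your DNS provider's update API here
            # (most registrars and dynamic-DNS services expose one).
            print("would update the A record to", ip)

        last_ip = None
        while True:
            ip = current_public_ip()
            if ip != last_ip:
                update_dns_record(ip)
                last_ip = ip
            time.sleep(CHECK_INTERVAL)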

    Read the article

  • Research on the best way to present multiple products on one page

    - by Michael Dibbets
    I am updating a webshop page. This is a fairly simple page that displays all the products that we currently sell. The page in development is visible here ( https://www.ortho.nl/wwebshop ). Now I'm curious, since I can't find anything via Google etc. (probably I don't know the right keywords): what is the best way to present multiple products on one page? Should you use borders? Should you use colours? Which colours? What kind of tweaks will direct the customer's attention to the right place? Does anyone know from experience or via research (and could you point me in the right direction to find that research) what the best way to present products is, so that conversion/click-through is optimised?

    Read the article

  • Firefox add-on to save a web page as PDF [closed]

    - by Jayapal Chandran
    Is there a Firefox add-on to save a web page as a PDF file? I want a free service if available. In Chrome, save as PDF works after pressing Ctrl+P, but this feature is not available in Firefox. You may ask why not use Chrome: I am using YSlow to generate reports, and YSlow does not show the printable view option in Chrome, whereas Firefox shows it. But Firefox does not have print/save as PDF, while Chrome does.
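
    Outside the browser, a headless renderer can do the same job from a script. A sketch assuming Python and the wkhtmltopdf command-line tool installed locally:

        import subprocess

        def save_page_as_pdf(url, output):
            # wkhtmltopdf renders the page with WebKit and writes a PDF.
            subprocess.run(["wkhtmltopdf", url, output], check=True)

        save_page_as_pdf("http://example.com/", "page.pdf")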

    Read the article

  • SEP Search Engine Placement

    - by Cory Baumer
    My employer was recently contacted by a company who convinced my employer that they could do what they called search engine placement, and that they can guarantee that we will be placed at the top of Google's search results. I immediately told my employer that it sounded like a scam, but I was made to contact the company regardless and investigate. The salesperson insisted that SEP was different from the SEO and AdWords campaigns we are already running, that it was a cheaper way to be listed in the AdWords section, and that it didn't include a cost per click. It sounds to me like it's somewhat scammy: as if they are going to set up an AdWords campaign and just charge us a flat rate that is higher than the cost per click. Has anyone heard of this, and is it at all legitimate?

    Read the article

  • Building an SMS web application [closed]

    - by ramesh babu
    Possible Duplicate: How to add SMS text messaging functionality to my website? I would like to build a web application; the purpose of the site is to send and receive SMS. I have researched a lot, but I didn't understand what the requirements are. Simply put, I want an application similar to way2sms.com, and I don't want to buy SMS from a company; I would like to build my own infrastructure. I have the web design part covered. I would like to know what the requirements are to send and receive SMS, and what infrastructure I need to do it.
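
    One way to own the whole sending/receiving infrastructure is a GSM modem (or phone) attached to the server, driven over a serial port with standard AT commands. A rough sketch assuming Python with pyserial; the port and phone number are placeholders:

        import time
        import serial  # pyserial

        def send_sms(port, number, text):
            modem = serial.Serial(port, 115200, timeout=5)
            modem.write(b"AT+CMGF=1\r")                   # switch the modem to text mode
            time.sleep(0.5)
            modem.write(b'AT+CMGS="' + number.encode() + b'"\r')
            time.sleep(0.5)
            modem.write(text.encode() + bytes([26]))      # Ctrl+Z terminates the message
            time.sleep(2)
            print(modem.read(modem.in_waiting or 1))      # modem response, e.g. +CMGS: <id>
            modem.close()

        send_sms("/dev/ttyUSB0", "+10000000000", "hello from my own SMS gateway")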

    Read the article

  • Do Spambots have access to unlimited IP addresses?

    - by Reg Gordon
    I have been attacked for weeks by the same spambot trying to brute-force the login page. I now have a login security module installed on my Drupal 6 website, and it bans an IP after x failed attempts. It's been going on forever, and I have banned about 1000 IP addresses. Is there any point in me banning by IP if the spambot has access to unlimited IP addresses, or will they run out of them eventually?
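
    If the bot really does rotate through a large address pool, a permanent ban list never converges; one way to keep it from growing forever is to rate-limit per time window instead. A minimal sketch of the counting logic, assuming Python (a real module such as Drupal's would persist this in the database):

        import time
        from collections import defaultdict

        MAX_ATTEMPTS = 5          # failures allowed per window
        WINDOW = 3600             # seconds
        attempts = defaultdict(list)

        def allow_login_attempt(ip):
            now = time.time()
            # Keep only failures inside the current window, then test the threshold.
            attempts[ip] = [t for t in attempts[ip] if now - t < WINDOW]
            return len(attempts[ip]) < MAX_ATTEMPTS

        def record_failure(ip):
            attempts[ip].append(time.time())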

    Read the article

  • Authentication system brainstorm

    - by gansbrest
    Hi. We have multiple small websites (microsites) and one main high-traffic one with a big user base. Right now the requirement is to build an authentication system which should allow users to log in with the same identity across the network. All websites run on different domains, are powered by the Drupal 6 CMS, and have separate databases (so sharing tables with a prefix is not an option, plus it creates a huge mess in the db). Here is the set of core requirements I came up with:
    1. Users should be able to log in with the same credentials to all sites within the network.
    2. User data is shared between the main site (storage) and all microsites within the network.
    3. Data is synchronized across the network when a user changes it (updates an email or password, for example).
    4. The login/registration process should be seamless and consistent.
    5. Users can register on any of the sites across the network and use that identity to log in later on.
    In the future there might be a need to add OpenID authentication options. Basically we are looking at something similar to what Stack Exchange does, but we're not sure if they have a central user base or not. I was thinking about a custom solution which will include two parts (modules): one will live on the main site, storing user data and responding to requests from clients; the second part (module) will be placed on each microsite and will send requests to the master. Some kind of client-server setup. One of the complications I see right away is #3, data synchronization across the network. I just don't want to reinvent the wheel, and maybe some work is already done in this direction. Looking forward to your ideas on how to approach this project. EDIT: We use a MySQL database.
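
    One common shape for the client-server setup described above: the master issues a token consisting of a payload plus an HMAC signature under a key shared with every microsite, so each microsite can verify a login locally without querying the master on every request. A minimal sketch assuming Python; the key, fields, and lifetime are illustrative:

        import hashlib
        import hmac
        import json
        import time

        SHARED_KEY = b"replace-with-a-long-random-secret"  # same key on master and microsites

        def issue_token(user_id, email):
            payload = json.dumps({"uid": user_id, "email": email, "ts": int(time.time())})
            sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
            return payload + "." + sig

        def verify_token(token, max_age=3600):
            # The signature is hex, so the last "." always separates payload from signature.
            payload, _, sig = token.rpartition(".")
            expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
            if not hmac.compare_digest(sig, expected):
                return None                      # forged or corrupted token
            data = json.loads(payload)
            if time.time() - data["ts"] > max_age:
                return None                      # stale token
            return data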

    Read the article

  • Google Analytics tracking example.com and www.example.com

    - by danferth
    Our website is set up to direct all traffic for www.example.com to example.com with a line in the .htaccess file. With Google Analytics' new in-page analytics feature, we are thinking of removing that line and allowing people to visit www.example.com as well, to play with the new features. My question is this: how will this change affect our analytics data?
    - Will nothing change, so we can start using the new feature with our existing data?
    - Or are the two domains tracked separately, so we would have to start over with www.example.com?
    Any help would be great, as I can find nothing on Google's help site covering this. Let me know if you need further explanation.
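
    Before changing anything, it may help to confirm exactly what the redirect currently returns for each hostname. A small diagnostic sketch assuming Python, with example.com standing in for the real domain:

        import urllib.error
        import urllib.request

        class NoRedirect(urllib.request.HTTPRedirectHandler):
            def redirect_request(self, req, fp, code, msg, headers, newurl):
                return None  # surface the 301/302 instead of following it

        opener = urllib.request.build_opener(NoRedirect)
        for url in ("http://www.example.com/", "http://example.com/"):
            try:
                resp = opener.open(url)
                print(url, "->", resp.getcode())
            except urllib.error.HTTPError as e:
                print(url, "->", e.code, e.headers.get("Location"))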

    Read the article

  • Preventing adult content in a forum

    - by John Doe
    I'm working on a forum that allows images attached to posts and doesn't require registration. Thing is, I'd like to provide a work-safe navigation option in which posts with porn images attached aren't shown. The ideas I've come up with are:
    1. Make the work-safe option the default and treat all posts with images attached as pornographic, making them visible only if the user unchecks it.
    2. Make all posts with images attached not work-safe by default and change their status to work-safe only after a moderator has approved it. Only then would they be visible if the user has the work-safe option checked (see the sketch below).
    Does anyone else have an idea? Also, how do the big web services deal with this? (YouTube, Craigslist, even Stack Exchange.) By the way, I don't think that "nudity detector" libraries are accurate; they give plenty of false positives and negatives. Thanks!
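
    A minimal sketch of option 2's filtering logic, assuming Python; the field names are hypothetical:

        def visible_posts(posts, work_safe_mode):
            # posts: iterable of dicts like
            #   {"id": 1, "has_image": True, "approved_work_safe": False}
            # Option 2: an image post is hidden in work-safe mode until a
            # moderator has explicitly marked it safe.
            for post in posts:
                if work_safe_mode and post["has_image"] and not post["approved_work_safe"]:
                    continue
                yield post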

    Read the article

  • Comparison of phrases containing the same word in Google Trends

    - by alisia123
    If I compare three phrases in Google Trends:
    - house
    - sale house
    - white house
    I get the following numbers: house - 91, sale house - 3, white house - 2. The question is: are "sale house" and "white house" already included in the number 91? It is an important question, because if it is true, then:
    house_except_sale_house + sale_house = 91
    sale_house = 3
    which means I have to compare 88 and 3 if I compare "house" and "sale house".

    Read the article

  • Which TLD would be suited to a personal site?

    - by Grant Palin
    I'm planning what is essentially a "business card" website for myself, and have been looking at domains. My primary site (a .com) is for my blog, portfolio, and the like, but I want a separate site for basic information, contact details, networks, and so on. In this light, there are a few TLDs that seem suitable: .info, .me, and .name. I'd appreciate remarks on the differences between these options, and suggestions on which would be suitable for my needs.

    Read the article

  • How to make a background image black-and-white?

    - by Dmitri
    Are there any filters that would make a background image display as black-and-white? What I mean is this: I have a background image set via CSS using background: url(/image.png); but now I need to apply a filter so that the image is shown in black and white only. Ideally, I would like to also apply opacity to it. The effect I'm really trying to achieve is to have the background image black/white, and on hover over that span element the filter would be removed, revealing the colour version. And of course it has to work in FF, Chrome, and IE. Can someone help me?

    Read the article

  • Customising Google Maps breaks highway label blocks

    - by user2248809
    I'm trying to customise a Google map to use shades of a particular colour. It's working nicely, except that the blocks containing major road names/numbers are illegible. I've figured out how to target styles to those elements, but setting the 'color' value sets both text and background to that colour, and no adjustment of saturation, gamma, lightness, etc. seems to make the text legible.

    function initialize() {
      var latlng = new google.maps.LatLng(50.766472, 0.284732);

      // Tint the whole map, then colour water separately.
      var styles = [
        {
          stylers: [
            { "gamma": 0.75 },
            { "hue": "#607C75" },
            { "saturation": -75 },
            { "lightness": 0 }
          ]
        },
        {
          featureType: "water",
          stylers: [ { color: "#607C75" } ]
        }
      ];

      var myOptions = {
        zoom: 15,
        center: latlng,
        mapTypeId: google.maps.MapTypeId.ROADMAP
      };

      var marker = new google.maps.Marker({
        position: latlng,
        title: "Living, dining, bedrooms by David Salmon"
      });

      var map = new google.maps.Map(document.getElementById("map"), myOptions);
      map.setOptions({ styles: styles });
      marker.setMap(map);
    }

    Read the article

  • Green Website Design

    - by Christofian
    This is kind of a strange question, but... There was a site called Blackle ( http://www.blackle.com/ ) which "claimed" to save energy by using a black background (it doesn't: see here: http://skeptics.stackexchange.com/questions/4373/how-much-energy-does-displaying-a-webpage-with-a-black-background-actually-save). However, Blackle and its idea of "green website design" interested me, and I was wondering if there are any ways to design an energy-saving website that actually saves energy. If anyone knows of any, please post them here. If nobody has any, then I guess there isn't a way to save energy through website design...

    Read the article

  • What Ranking Factors Are Used For International Search?

    - by Itai
    Google.com vs Google.ca vs Google.co.uk (etc.) all rank their results differently. The intention is to return more locally relevant content. What factors, other than the ones below, are used to determine local relevancy? I already know about the TLD (.com, .ca, etc.) and likely the server IP address, but there has to be more, as this would not explain some search results I noticed this week. In particular, I see a US-based site ranking #3 for some keywords on Google.com, ranking #5 on Google.ca, and not ranking within the first pages on Google.co.uk. On Google.com it outranks an Australian site, which in turn outranks it on Google.ca. The site itself is relevant for all English-speaking locations, yet it is being outranked by sites from different regions on different Google TLDs (though not by ones from the same region as the TLD).

    Read the article

  • Move site to new domain divided by language across subdomains

    - by mark
    I managed to find a nice domain for a fairly fledgling site of mine that actually hasn't been parked by scumbag squatters. Given the upcoming move, I'm thinking I'd take the opportunity to split the content across subdomains according to language, much like Wikipedia, for example:
    current:
    www.old-domain.com/en/subject # English
    www.old-domain.com/subjecto # Spanish (the default, so no locale in the URL)
    proposed:
    en.new-domain.com/subject
    es.new-domain.com/subjecto
    One advantage of doing this involves a fairly competitive keyword, such that I may wish to put a copy of my application on a Spanish slice in order to gain a few SERPs. Also pure vanity. Google's Webmaster Tools allows me to move to the new domain, and I can add the root domain and the subdomains, but forward to only one. I'll 301 from the old domain appropriately, but is there anything I should know about Webmaster Tools in this respect, where effectively I'm moving to two addresses? (Feel free to dissuade me from doing this in the comments if it's a bad idea.)
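
    Whichever way you decide, the 301s boil down to a mechanical mapping from old paths to language subdomains. A small sketch assuming Python, using the question's own example domains:

        def new_url(old_path):
            """Map an old-domain path to its language subdomain on the new domain."""
            if old_path.startswith("/en/"):
                return "http://en.new-domain.com/" + old_path[len("/en/"):]
            # No locale prefix meant Spanish (the default) on the old domain.
            return "http://es.new-domain.com" + old_path

        assert new_url("/en/subject") == "http://en.new-domain.com/subject"
        assert new_url("/subjecto") == "http://es.new-domain.com/subjecto"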

    Read the article

  • How can I replicate Google Page Speed's lossless image compression as part of my workflow?

    - by Keefer
    I love that Google's Page Speed is able to losslessly compress a lot of my images, but I'd love to make that part of my workflow, prior to uploading a site and making it live. Is there anything I can run locally to give me the same lossless compression? I currently export images using Export for Web in Photoshop, and use a little application called PNGCrusher to reduce the file size of PNGs. I'd love to find a faster way, though, than saving out and replacing the individual images from Page Speed's results.
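
    Page Speed's lossless savings come from the same kinds of optimizations performed by the optipng and jpegtran command-line tools. A batch sketch assuming Python with both tools installed; the folder path is a placeholder:

        import pathlib
        import subprocess

        def compress_losslessly(folder):
            for path in pathlib.Path(folder).rglob("*"):
                if path.suffix.lower() == ".png":
                    # optipng rewrites the file in place with better DEFLATE settings.
                    subprocess.run(["optipng", "-o7", str(path)], check=True)
                elif path.suffix.lower() in (".jpg", ".jpeg"):
                    # jpegtran losslessly optimizes Huffman tables and strips metadata;
                    # write to a temp file, then swap it in.
                    tmp = path.with_suffix(path.suffix + ".tmp")
                    subprocess.run(
                        ["jpegtran", "-optimize", "-copy", "none",
                         "-outfile", str(tmp), str(path)],
                        check=True,
                    )
                    tmp.replace(path)

        compress_losslessly("./site/images")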

    Read the article

  • IIS Not Accepting Login Credentials

    - by Dale Jay
    I have an ASP.NET web form using Microsoft's boilerplate Active Directory login page, set up exactly as suggested. (See http://msdn.microsoft.com/en-us/library/ms180890%28v=vs.80%29.aspx) Windows Authentication is activated at the "Default Website" and "MyWebsite" levels, and Domain\This.User is given "Allow" access to the site. After entering valid credentials for This.User on the web form, a popup window appears asking me to enter my credentials yet again. Despite entering valid credentials for This.User (trying both the Domain\This.User and This.User formats), it rejects the credentials and returns an unauthorized-user page. The Active Directory user This.User is valid, the IP address of the AD server has been verified, and SPNs have been set up for the server. Any thoughts as to what may be causing this? I can post code if needed.

    Read the article

  • Google is not indexing my entire site despite having a sitemap

    - by Anusha
    I have an e-commerce website, www.beyondtime.in. I have been constantly monitoring Googlebot crawling on my website and my Webmaster account. Lately, I have found two issues that I have not been able to understand.
    1) Googlebot has been crawling only www.beyondtime.in/telecom.php, even though the URL is not valid. What needs to be done to let Google crawl other pages of the website as well?
    2) The second question is about my Google Webmaster account, where I've submitted my sitemap with 227 URLs. Of those, only 156 have been indexed, and none of the images on my website have been indexed by Google.

    Read the article

  • How to spread XML sitemaps over several webservers behind an AWS load balancer?

    - by Jurik
    We have a web portal with almost a million products and many more other URLs. I wrote a script that checks the database: if a new URL is needed, or an old one was updated, the script updates/creates the XML sitemaps. The script checks the database for each URL for updates, so that it updates the appropriate XML file too. But we have several servers behind the load balancer at our rented AWS space. My question is: how do I spread those XML sitemaps over all webservers behind this AWS load balancer? Our approaches/ideas:
    1. We could just generate them on one server with a cron job and copy them to the other servers, but this could be difficult because the number of servers rises and falls automatically.
    2. We could put them on our S3, but S3 is not available through our domain, so I guess Google will have a problem with that.
    3. I could let my script run on every webserver, changed so that each run regenerates all the XML files if they do not exist. But then I would have conflicts with updated URLs in my database, where I save the timestamp of the last change for every URL.
    Is there another, better solution that I do not know of? Are there any special services by Amazon for such cases?
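
    A sketch of the "generate once, publish to shared storage" variant (approaches 1 and 2 combined): one cron job on one server builds the file and pushes it to S3 with boto3. Assumes Python; the bucket name and URL list are placeholders, and in the real script the URLs would come from the database query:

        import boto3

        def build_sitemap(urls):
            entries = "".join("  <url><loc>%s</loc></url>\n" % u for u in urls)
            return (
                '<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                + entries + "</urlset>\n"
            )

        xml = build_sitemap(["http://example.com/product/1",
                             "http://example.com/product/2"])

        with open("sitemap.xml", "w") as f:
            f.write(xml)

        # Push to one shared location instead of keeping a copy per webserver.
        s3 = boto3.client("s3")
        s3.upload_file("sitemap.xml", "my-sitemap-bucket", "sitemap.xml")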

    Read the article

  • Making profit from a social network

    - by James P.
    This follows similar questions, but I'd like to see if anything particular comes out of it due to the nature of the site. In short, I've taken up the role of webmaster for a small social network site and wish to make it profitable, at least enough to cover the running costs. The site is linked to a commerce operation, and presents are offered to members according to the number of points they've accumulated through various actions. The site is running on shared hosting, so that's probably dirt cheap, but the presents can be expensive as a whole, and some money has already been invested into the project. One idea I have is to seek some sponsors that would be willing to offer presents or special offers in return for publicity; I don't know if this will be easy or not. I'm also looking into adapting the hosting, perhaps moving static files to a cheaper online storage medium (see Ideas for reducing storage needs and/or costs (lots of images)). Other suggestions are welcome.

    Read the article

  • How to create sitemap for my shopping site?

    - by John Sanjay
    I have a shopping site for home goods, and I need to create and submit its sitemap in Google Webmaster Tools. I know there are several online tools to generate an XML sitemap, but someone told me that shopping sites' sitemaps are different from other sites': we have to submit sitemaps in two formats. One is a static page sitemap and the other is a dynamic product page sitemap. Is this true? If so, how do I create sitemaps in these two formats?
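
    Whatever the two files end up being called, the standard way to tie them together is a sitemap index file (part of the sitemaps.org protocol), which is what gets submitted in Webmaster Tools. A sketch assuming Python; the domain and file names are placeholders:

        SITEMAP_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>http://example.com/sitemap-static.xml</loc></sitemap>
          <sitemap><loc>http://example.com/sitemap-products.xml</loc></sitemap>
        </sitemapindex>
        """

        # The static sitemap can be written once by hand; the product sitemap
        # would be regenerated from the product database on a schedule.
        with open("sitemap-index.xml", "w") as f:
            f.write(SITEMAP_INDEX)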

    Read the article

  • Is real estate booming again in India?

    - by skzameen
    The present real estate scenario in India is very good. The real estate boom in India is linked directly to industrial, commercial, and economic growth; stability and the strong presence of international companies throughout India have made it a preferred destination for investment in the real estate sector. India has one of the fastest-growing economies and stock markets. For more information about residential and commercial projects or properties, log on to www.zameen-zaidad.com or email [email protected]. Contact us: Zameen-zaidad.com, Ph: +91-11-40024002, M: +91-9810445860. Share your opinion of the www.zameen-zaidad.com portal.

    Read the article

  • Using YouTube as a CDN

    - by Syed
    Why isn't YouTube used as a CDN for video and audio files? Through YouTube's API and developer tools, it would be possible to post all media files to YouTube from a CMS and then call them when needed. This seems to be within YouTube's TOS, it's a cost-effective way to store, retrieve, and distribute media files, and it could also make for easy monetization. I ask because I'm working on a new project for a public radio station. I can't figure out the real downside to this sort of implementation.

    Read the article
