Search Results


  • How do I set up anonymous email forwarder using cPanel?

    - by Gravitas
Hi. Some companies demand your email address, then send you spam. I'm quite familiar with cPanel. How would I set up an anonymous email forwarder, so I can give them a valid email address and kill that email address if the company turns into an evil spammer? Note that to be effective, it would have to filter out any email addresses listed in the body of the forwarded email (otherwise those email addresses will end up on their spam list too).
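    Since you're comfortable with cPanel: besides the Forwarders UI, this can be scripted. A sketch using cPanel's UAPI from shell access (the addresses are examples, and the parameter names are as I recall them from the UAPI Email docs, so verify against your cPanel version):

        # Create a disposable forwarder for one company
        uapi Email add_forwarder domain=example.com email=acme-signup@example.com fwdopt=fwd fwdemail=me@example.com

        # Kill it once they turn into a spammer
        uapi Email delete_forwarder address=acme-signup@example.com forwarder=me@example.com

    The body-rewriting requirement is beyond a plain forwarder, though; that needs the incoming mail piped through a script that scrubs addresses before re-sending (cPanel forwarders have a pipe option for this, if I recall correctly).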


  • Building an SMS web application [closed]

    - by ramesh babu
    Possible Duplicate: How to add SMS text messaging functionality to my website? I would like to build a web application whose purpose is to send and receive SMS. I have researched this a lot, but I still don't understand the requirements. I simply want an application similar to way2sms.com, and I don't want to buy SMS capacity from a company; I would like to build my own infrastructure. I can handle the web design side. What are the requirements for sending and receiving SMS, and what infrastructure do I need for it?
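    In case it helps frame the question: running your own infrastructure usually means one or more GSM modems attached to your server, driven over a serial port with standard AT commands. A minimal sketch, assuming Python with the pyserial package (the device path and phone number are placeholders):

        import time
        import serial  # third-party package: pip install pyserial

        # Open the GSM modem's serial port (device path is an example)
        modem = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=5)

        def send_sms(number, text):
            """Send one SMS in GSM text mode using standard AT commands."""
            modem.write(b"AT+CMGF=1\r")              # put the modem in text mode
            time.sleep(0.5)
            modem.write(b'AT+CMGS="' + number.encode() + b'"\r')
            time.sleep(0.5)
            modem.write(text.encode() + b"\x1a")     # Ctrl-Z terminates the message
            print(modem.read(64))                    # '+CMGS: <id>' indicates success

        send_sms("+10005550100", "Hello from my own SMS gateway")

    Receiving works the same way (AT+CMGL lists stored messages), and for real volume people usually put an open-source SMS gateway such as Kannel between the web app and the modems.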


  • Best practice for meta tags in various languages

    - by Jack Lockyer
    We have a global site, all hosted on one .com domain (www.website.com/en, www.website.com/es, www.website.com/pt, www.website.com/ru, etc.). Each language subdirectory is identical to the others, apart from being in a different language. My question is: should I translate each meta keyword for each page, or just use the English versions? E.g., the English page about private jets uses the keyword "private jet"; should the French version of exactly the same page use "private jet" or "jet privé"? If anyone knows whether language-specific keywords carry any weight in search engines when the website is a .com rather than a country-specific domain, that would be really helpful! Thanks in advance!
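    A side note that may help whatever you decide about the keywords: you can tell search engines explicitly which URL is which language with rel="alternate" hreflang annotations in each page's head. A sketch (the page path is a placeholder; the subdirectories follow the structure described above):

        <link rel="alternate" hreflang="en" href="http://www.website.com/en/private-jets" />
        <link rel="alternate" hreflang="es" href="http://www.website.com/es/private-jets" />
        <link rel="alternate" hreflang="pt" href="http://www.website.com/pt/private-jets" />
        <link rel="alternate" hreflang="ru" href="http://www.website.com/ru/private-jets" />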


  • How can I allow robots access to my sitemap, but prevent casual users from accessing it?

    - by morpheous
    I am storing my sitemaps in my web folder. I want web crawlers (Googlebot etc.) to be able to access the files, but I don't necessarily want all and sundry to have access to them. For example, this site (superuser.com) has a site index, as specified by its robots.txt file (http://superuser.com/robots.txt). However, when you type http://superuser.com/sitemap.xml, you are directed to a 404 page. How can I implement the same thing on my website? I am running a LAMP site, and I am using a sitemap index file (so I have multiple sitemaps for the site). I would like to use the same mechanism to make them unavailable via a browser, as described above.
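    One common approach on Apache is to leave the sitemap files in place but have .htaccess serve them only to known crawler user-agents. A sketch (note that user-agent strings can be spoofed, so this deters casual visitors rather than providing real security):

        <FilesMatch "sitemap.*\.xml$">
            SetEnvIfNoCase User-Agent (googlebot|bingbot|slurp|msnbot) allowed_bot
            Order Deny,Allow
            Deny from all
            Allow from env=allowed_bot
        </FilesMatch>

    A stricter variant verifies crawlers by reverse DNS lookup, which is what you would want if the sitemap URLs genuinely must stay private.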


  • How long should my HTML page title really be?

    - by RandomBen
    How long can the text within my <title></title> tags really be? I know Google cuts it off at some point, but when? When I used IIS7's SEO Toolkit 1.0, I got an error stating my title should be under 65 characters. I have a book by Bruce Clay that states I should use 62-70 characters, roughly 9 +/- 3 words. I have also used SenSEO's Firefox add-on, which says to use a maximum of 65 characters, or roughly 15 words. What is the real maximum? I have two sources saying 65 and one saying 62-70, but Bruce Clay is generally held in high regard.


  • URL masking with .htaccess

    - by Michael Nguyen
    I need a hardcore programmer to help me with URL masking. This is my situation: my website www.michaelfotograf.dk/blog/ is the main site that needs to be configured. I have another website/webhotel, www.umagepar.dk, which is redirected to www.michaelfotograf.dk/blog/. The blog is an ongoing project where I post a lot of material to push my ranking higher on Google. www.umagepar.dk, which redirects to www.michaelfotograf.dk/blog/, is also a project in its own right, so I do not want people to know that it is a blog connected with www.michaelfotograf.dk/blog/. I therefore need to mask www.michaelfotograf.dk/blog/ so the URL in the address bar always reads www.umagepar.dk! What I need is a programmer who can do this for me. Name your price, and I'll see if I can afford it. Michael
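    For what it's worth, one frame-free way to do this is a reverse proxy rather than a redirect: www.umagepar.dk serves the blog's pages itself, so the address bar never changes. A sketch for an .htaccess on the umagepar.dk webhotel, assuming its host enables mod_rewrite and mod_proxy:

        RewriteEngine On
        # Fetch everything from the blog behind the scenes, keep umagepar.dk in the URL
        RewriteRule ^(.*)$ http://www.michaelfotograf.dk/blog/$1 [P,L]

    Many shared webhotels disable mod_proxy, in which case the usual fallback is a full-page iframe. Note that either way, the same content appearing on two domains can itself look like duplicate content to Google.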


  • What expectation should I have of South African web development rates / duration? [closed]

    - by Warren van Rooyen
    I am a developer, but I only intend to do the front-end work for getting Reddit-like upvote/downvote functionality going on an upcoming site I'm building. I have never had to contract a developer for back-end work, so I am quite in the dark about how much I should expect to pay and how long it could take to get the site going. I could be taken for a ride, since a developer could inflate the time estimate while quoting a seemingly normal rate (hourly or daily), or could otherwise inflate the rate itself. Could you please give me some guidance on this? I know you need some background on the nature of the site, so here it is: I have a Reddit-type template with the CSS and PHP included. I then downloaded the Pligg code, which is intended to provide the Reddit-style upvote/downvote functionality. How long would a developer roughly need to unite the theme and front end with the back-end functionality? I understand this is not a lot of information, but I'm sure you're experienced enough to have an instinct for the size of the project. Also, should I agree an hourly rate, a day rate, or a fixed price for the project?


  • How can I host a website on a dynamically-assigned IP address?

    - by nick
    I recently upgraded my internet connection to the point that it is much faster and more reliable than my current web host. I would like to move my domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see of doing this:
    1. Convince my ISP to give me a static IP.
    2. Assign my router my current IP to force a static IP (which might work?).
    3. Set my DNS record to my current IP address and update it on the rare occasions that it changes.
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Which of these options will work most reliably?
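    On the options: the second is fragile at best, since the ISP's DHCP server, not your router, decides which address you get. The third is exactly what dynamic-DNS services automate, and many DNS hosts accept the de-facto dyndns2 update protocol, so a small script on a cron job can keep the record current. A sketch, assuming Python; the endpoint, hostname and credentials are placeholders for your provider's:

        import urllib.request

        # Placeholders: your DNS provider's dyndns2-style endpoint and account
        UPDATE_URL = "https://members.dyndns.org/nic/update"
        HOSTNAME, USER, PASSWORD = "home.example.com", "user", "secret"

        def current_ip():
            """Ask an external service for our public address."""
            return urllib.request.urlopen("https://api.ipify.org").read().decode()

        def update_dns(ip):
            """Push the address; the protocol answers 'good <ip>' or 'nochg <ip>'."""
            mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
            mgr.add_password(None, UPDATE_URL, USER, PASSWORD)
            opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
            url = "%s?hostname=%s&myip=%s" % (UPDATE_URL, HOSTNAME, ip)
            return opener.open(url).read().decode()

        print(update_dns(current_ip()))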


  • Can an expert examine my .NET MVC 4 application? [on hold]

    - by Till Death Developer
    Problem definition: I need an expert to examine my application, not for errors, but to look at how my implementation is put together and tell me whether I am doing a good job or just creating a huge mess, and please give me suggestions on how I should improve my work. Points of concern:
    1. A neat solution (you can find the thing you are looking for easily).
    2. Low redundancy.
    3. Efficiency (load time, speed, etc.).
    4. Data access implementation.
    5. Authentication system implementation.
    6. Data services implementation.
    Note: the application is just a playground for testing new implementation approaches, so it may seem meaningless, because it is; that isn't the subject anyway. I just need to know whether I am doing things in a good way (there is no single right way, but there is good and bad). Solution link: http://www.mediafire.com/?8s70y44w16n1uyx


  • Weird unexpected image compression on a web server running Apache on Ubuntu?

    - by Billy Bob Thornton
    I have a weird problem on my production web server running Apache on Ubuntu: it compresses my images, thereby dramatically lowering their quality! Actually I have two virtual hosts running, each located in a different folder. Whether I display .gif images by navigating the two sites or access them directly by their URL, their size and quality are invariably degraded. I tried with three different browsers: same problem. Using them on other sites on the web: no problem. Of course I disabled mod_deflate on the server (which should not compress images anyway), but the phenomenon remains. On my local development server, running the same configuration, everything is OK. Now I'm completely lost! For the record, my configuration: Ubuntu 10.04, Apache 2, PHP 5.
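    A quick way to narrow down what is touching the images is to compare raw response headers between production and the local server, e.g. (paths are placeholders):

        curl -sI http://your-production-host/images/sample.gif
        curl -sI http://localhost/images/sample.gif

    If Content-Length differs for the same file, look for headers such as Via, X-Cache or X-Mod-Pagespeed: an ISP proxy or a module like mod_pagespeed recompressing images would show up there, and apache2ctl -M will confirm whether any pagespeed module is loaded on the production box.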


  • Is MediaTemple's (gs) really worth the hassle? [closed]

    - by Andrew
    I have been hosting my sites with Dreamhost for a while, and although none of them are high-traffic at the moment, I am going to launch a Rails app this summer and a couple of other things, so I need a serious host. Since my plan ends in a couple of days, I have been looking at alternatives, and because MT has such a good reputation in the web design world, I have been seriously considering paying the ridiculous $20/month for its shared hosting service. That was until I actually read some reviews of it, most of which indicated it is slow and overpriced. So now I'm wondering whether switching over to (gs) would really be a good idea, or if I would be better off paying less money for something like a Site5 or Hostgator shared hosting plan. What is your experience with MT, and particularly their Grid Service? Do you think I should even switch to (gs) in the first place, or should I choose something else from its competitors?


  • Do Google's feed statistics include former users?

    - by jjackson
    I'm currently not using any sort of fancy stat-tracking software such as FeedBurner, but I occasionally look at Google's stats in their Webmaster Tools just to get a rough idea of whether the number of subscribers is going up or down. This only gives the number of users subscribed through Google products, as they explain in their help documents: "Subscriber stats display the number of Google users who have subscribed to your feeds using any Google product (such as Reader, iGoogle, or Orkut). Because users can subscribe to feeds using many different aggregators or RSS readers, the actual number of subscribers to your site may be higher." I used to use Google Reader very regularly but haven't opened it in a while now. The way I understand it, this means that even though I haven't touched any of those feeds in a long time, I'm still technically subscribed to them and will therefore be included in Google's statistics. Is this correct? Also, since Google runs FeedBurner, does this have any effect on its stats as well?


  • Are there specific legal issues for web developers working on sex dating sites?

    - by YumYumYum
    Say I have created many ordinary websites that are not related to any dating/sexual content. Are the rules and regulations for a developer the same when making a sex-related dating site? I'm talking about a site where people meet and get to know each other with the intent of having a sexual relationship (you know what I mean), also featuring webcam sex, but not explicitly a porno site. Do such sites have any special legal issues for developers compared with non-sexual/dating sites?


  • How do I fix this specific Google "Fetch as Googlebot" error appearing in my Webmaster Tools?

    - by UXdesigner
    Good day. I'm currently trying to find out why my website has lost all of its rank in Google. I don't even appear in the Google results for my own domain, though other sites that link to me do appear in the results. I think it's because I left the site alone for two months and came back to 20k comment-spam entries, which I completely deleted and fixed with filters and a new Disqus comment service. The thing is, I added my site to Google Webmaster Tools and I'm finding out several awful things. For example, when I use Fetch as Googlebot, I receive the error message below in response to my request, and I don't even know what the real problem is or how to fix it. I simply don't get it. This is what appears:
        Date: Wednesday, July 20, 2011 9:43:35 AM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 55
        HTTP/1.1 403 Forbidden
        Date: Wed, 20 Jul 2011 16:43:36 GMT
        Server: Apache
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 248
        Keep-Alive: timeout=2, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1
        403 Forbidden
        Forbidden: You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.
    Do you guys know anything about this problem? I need to have Google crawl my site again; I had really good Google results for the past three years, and now there's nothing. Thanks.
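    That response means Apache itself is refusing every request for / (and then also refusing the custom error page, hence the "Additionally, a 403..." line). Given the comment-spam cleanup, a likely suspect is an overly broad deny rule left in .htaccess or the vhost config. A sketch of the pattern to look for, using example documentation-range IPs:

        # Too broad - this blocks everyone, Googlebot included:
        #   Order Allow,Deny
        #   Deny from all

        # Deny only the offending addresses instead:
        Order Allow,Deny
        Allow from all
        Deny from 203.0.113.45 198.51.100.0/24

    Also check file permissions and that no anti-spam plugin is blanket-blocking unknown user agents; once Fetch as Googlebot succeeds, resubmit the site in Webmaster Tools.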


  • Duplicate Content Problem due to plugin

    - by Amar Ryder
    I am running a WordPress website, 'example', on which I have installed the Transposh plugin. Unfortunately, despite English being the default language and therefore available at example.com/xxx, Google is indexing example.com/en/xxx as well, so I now have a duplicate-content problem. I want to remove this plugin and its links from Google so that my content is clean, without duplicate-content pages. Do you have a solution for doing this safely? My own thought is to remove the plugin from the website; that will create 404 errors for the Google-indexed links, but I can add redirect code to .htaccess until Google drops the not-found example.com/en/xxx links. If you know any other healthy way to handle this, please help me!
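    On the redirect idea: a 301 is indeed the healthy way to do it, because Google then folds the indexed /en/ URLs into the canonical ones instead of accumulating 404s. A sketch for the top of the WordPress .htaccess, above the standard WordPress rewrite block:

        RewriteEngine On
        # Permanently redirect example.com/en/anything to example.com/anything
        RewriteRule ^en/(.*)$ /$1 [R=301,L]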


  • Comparison of phrases containing the same word in Google Trends

    - by alisia123
    If I compare three phrases in Google Trends:
    house
    sale house
    white house
    I get the following numbers:
    house: 91
    sale house: 3
    white house: 2
    The question is: are "sale house" and "white house" already included in the number 91? It is an important question, because if they are, then:
    house_except_sale_house + sale_house = 91
    sale_house = 3
    which means I have to compare 88 and 3 when I compare "house" and "sale house".


  • Understanding CTR in Google Webmaster Tools

    - by sam
    I've got a site that's showing a 9% CTR for a phrase in Google Webmaster Tools, but the average position for my site is 14th (this includes 7 local results for this phrase). I was a little confused as to what the CTR actually means. Is it that for each person who searches for that phrase, 9% of them click my site; or that for each person who actually sees my site in the search results, 9% of them click through (bearing in mind 14th is high on page 2 when the local listings are used)?


  • Preventing adult content in a forum

    - by John Doe
    I'm working on a forum that allows images attached to posts and doesn't require registration. The thing is, I'd like to provide a work-safe navigation option in which posts with porn images attached aren't shown. The ideas I've come up with are:
    1. Make the work-safe option the default and treat all posts with images attached as pornographic, making them visible only if the user unchecks it.
    2. Make all posts with images attached not work-safe by default, and change their status to work-safe only after a moderator approves it; only then would they be visible when the user has the work-safe option checked.
    Does anyone else have an idea? Also, how do the big web services deal with this (YouTube, Craigslist, even Stack Exchange)? By the way, I don't think "nudity detector" libraries are accurate; they give plenty of false positives and negatives. Thanks!


  • Customising Google Maps breaks highway label blocks

    - by user2248809
    I'm trying to customise a Google map to use shades of a particular colour. It's working nicely, except that the blocks containing major road names/numbers are illegible. I've figured out how to target styles to those elements, but setting the 'color' value sets both text and background to that colour, and no adjusting of saturation, gamma, lightness etc. seems to make the text legible.

        function initialize() {
          var latlng = new google.maps.LatLng(50.766472, 0.284732);
          var styles = [
            {
              stylers: [
                { "gamma": 0.75 },
                { "hue": "#607C75" },
                { "saturation": -75 },
                { "lightness": 0 }
              ]
            },
            {
              featureType: "water",
              stylers: [ { color: "#607C75" } ]
            }
          ];
          var myOptions = {
            zoom: 15,
            center: latlng,
            mapTypeId: google.maps.MapTypeId.ROADMAP
          };
          var marker = new google.maps.Marker({
            position: latlng,
            title: "Living, dining, bedrooms by David Salmon"
          });
          var map = new google.maps.Map(document.getElementById("map"), myOptions);
          map.setOptions({ styles: styles });
          marker.setMap(map);
        }
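    The label blocks can be styled separately from the geometry, so rather than letting the global hue/saturation wash over them, you can target the road labels' text fill and stroke directly. A sketch of extra entries for the styles array above (the colours are examples, not tested values):

        {
          featureType: "road.highway",
          elementType: "labels.text.fill",
          stylers: [ { color: "#ffffff" } ]
        },
        {
          featureType: "road.highway",
          elementType: "labels.text.stroke",
          stylers: [ { color: "#607C75" } ]
        }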


  • Which TLD would be suited to a personal site?

    - by Grant Palin
    I'm planning what is essentially a "business card" website for myself and have been looking at domains. My primary site (a .com) is for my blog, portfolio and the like, but I want a separate site for basic information, contact details, networks, and so on. In this light, a few TLDs seem suitable: .info, .me, and .name. I'd appreciate remarks on the differences between these options, and suggestions on which would best suit my need.


  • IIS Not Accepting Login Credentials

    - by Dale Jay
    I have an ASP.NET web form using Microsoft's boilerplate Active Directory login page, set up exactly as suggested (see http://msdn.microsoft.com/en-us/library/ms180890%28v=vs.80%29.aspx). Windows Authentication is activated at the "Default Website" and "MyWebsite" levels, and Domain\This.User is given "Allow" access to the site. After entering valid credentials for This.User on the web form, a popup window appears asking me to enter my credentials yet again. Despite entering valid credentials for This.User (trying both the Domain\This.User and This.User formats), it rejects the credentials and returns an unauthorized-user page. The Active Directory user This.User is valid, the IP address of the AD server has been verified, and SPNs have been set up for the server. Any thoughts as to what may be causing this? I can post code if needed.
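    One hedged guess, since the symptom is a browser popup: the MSDN page linked above describes Forms authentication against AD, and the popup is characteristic of IIS issuing its own Windows Authentication (NTLM/Kerberos) challenge, so IIS may be challenging the browser before the forms page ever runs. If the Forms-auth pattern is the goal, the usual shape is Anonymous access enabled in IIS plus something like this in web.config (the connection string and names are placeholders):

        <connectionStrings>
          <add name="ADConnection"
               connectionString="LDAP://dc.example.com/DC=example,DC=com" />
        </connectionStrings>
        <system.web>
          <authentication mode="Forms">
            <forms loginUrl="Login.aspx" timeout="30" />
          </authentication>
          <membership defaultProvider="ADMembership">
            <providers>
              <add name="ADMembership"
                   type="System.Web.Security.ActiveDirectoryMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                   connectionStringName="ADConnection"
                   attributeMapUsername="sAMAccountName" />
            </providers>
          </membership>
          <authorization>
            <deny users="?" />
          </authorization>
        </system.web>

    If, instead, integrated Windows Authentication (no login form at all) is what you want, then the form and the popup are doing the same job twice, and the popup's rejection usually points back at the SPN/delegation setup.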


  • How can I replicate Google Page Speed's lossless image compression as part of my workflow?

    - by Keefer
    I love that Google's Page Speed is able to losslessly compress a lot of my images, but I'd like to make that part of my workflow, prior to uploading a site and making it live. Is there anything I can run locally to give me the same lossless compression? I currently export images with Photoshop's Export for Web and use a little application called PNGCrusher to reduce the file size of PNGs. I'd love to find a faster way than saving out and replacing the individual images from Page Speed's results.
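    Much of Page Speed's lossless savings come from the same open-source tools you can run locally in a build step; a sketch using optipng and jpegtran (both widely packaged; the filenames are examples):

        # PNGs: recompress in place, losslessly (higher -o levels are slower)
        optipng -o7 images/*.png

        # JPEGs: strip metadata and optimize Huffman tables, losslessly
        jpegtran -copy none -optimize -outfile images/photo-opt.jpg images/photo.jpg

    Wiring those into a small script over your export folder replaces the save-out-and-replace round trip through Page Speed's results.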


  • Google Analytics tracking example.com and www.example.com

    - by danferth
    Our website is set up to direct all traffic for www.example.com to example.com with a line in the .htaccess file. With Google Analytics' new In-Page Analytics feature, we are thinking of removing that line and allowing people to visit www.example.com as well, to play with the new features. My question is: how will this change affect our analytics data? Will nothing change, so we can start using the new feature with our existing data? Or are the two domains tracked separately, meaning we would have to start over with www.example.com? Any help would be great, as I can find nothing on Google's help site covering this. Let me know if you need further explanation.
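    One caveat worth checking while you experiment: if both hostnames become reachable, the classic ga.js snippet should be told to treat them as one site by pinning the cookie domain, otherwise visits can be split when users cross between hosts. A sketch (the UA number is a placeholder):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
        _gaq.push(['_setDomainName', 'example.com']);  // one cookie for www and non-www
        _gaq.push(['_trackPageview']);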


  • SEP (Search Engine Placement)

    - by Cory Baumer
    My employer was recently contacted by a company who convinced my employer that they could do what they called search engine placement, and that they can guarantee we will be placed at the top of Google's search results. I immediately told my employer that it sounded like a scam, but I was made to contact the company regardless and investigate. The salesperson insisted that SEP is different from the SEO and AdWords campaigns we are already running, that it is a cheaper way to be listed in the AdWords section, and that it doesn't include a cost per click. It sounds to me like a scam, as if they are going to set up an AdWords campaign and just charge us a flat rate that is higher than the cost per click. Has anyone heard of this, and is it at all legitimate?


  • Authentication system brainstorm

    - by gansbrest
    We have multiple small websites (microsites) and one main high-traffic one with a big user base. The current requirement is to build an authentication system that allows users to log in with the same identity across the network. All the websites run on different domains, are powered by the Drupal 6 CMS, and have separate databases (so sharing tables with a prefix is not an option; besides, it creates a huge mess in the db). Here is the set of core requirements I came up with:
    1. Users should be able to log in with the same credentials on all sites within the network.
    2. User data is shared between the main site (the storage) and all microsites within the network.
    3. Data is synchronized across the network when a user changes it (updates an email address or password, for example).
    4. The login/registration process should be seamless and consistent.
    5. Users can register on any of the sites across the network and use that identity to log in later on.
    In the future there might be a need to add OpenID authentication options. Basically we are looking at something similar to what Stack Exchange does, but I'm not sure whether they have a central user base or not. I was thinking about a custom solution with two parts (modules): the first would live on the main site, storing user data and responding to requests from clients; the second would be placed on each microsite and send requests to the master. Some kind of client-server setup (see the sketch below). One complication I see right away is #3, data synchronization across the network. I just don't want to reinvent the wheel, and maybe some work has already been done in this direction. Looking forward to your ideas on how to approach this project. EDIT: We use a MySQL database.
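    Two notes that might save reinventing the wheel: Drupal has prior art for exactly this (the Bakery module does shared sign-on across domains, and is what drupal.org itself has used), and if you do build the custom master/client modules, the heart of the design is the master issuing a signed, time-limited assertion that microsites can verify without any database sharing. A minimal sketch of that signing step (Python for brevity; the secret and field layout are illustrative, and the real thing would live in the two Drupal modules):

        import hashlib, hmac, time

        SHARED_SECRET = b"long-random-key-known-to-master-and-microsites"  # illustrative

        def issue_token(user_id):
            """Master site: mint a signed, timestamped SSO assertion after login."""
            payload = "%s|%d" % (user_id, int(time.time()))
            sig = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
            return payload + "|" + sig

        def verify_token(token, max_age=300):
            """Microsite: trust the identity only if signature and timestamp hold."""
            user_id, issued, sig = token.rsplit("|", 2)
            expected = hmac.new(SHARED_SECRET, ("%s|%s" % (user_id, issued)).encode(),
                                hashlib.sha256).hexdigest()
            if hmac.compare_digest(sig, expected) and time.time() - int(issued) < max_age:
                return user_id  # local site now loads or creates its own account row
            return None

        assert verify_token(issue_token("alice")) == "alice"

    With that in place, requirement 3 reduces to the master being the single writer for shared profile fields, with microsites fetching them over the same authenticated channel instead of keeping their own copies.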

