Search Results

  • Jokes search engine / PHP-based search engine on a database

    - by matt_tm
    I'm looking for a script, functioning like the Google homepage, that fetches data from a database rather than the internet. It is not intended to be a general search engine, but a repository of jokes that can be pulled up depending on the keywords typed. No sophisticated search techniques are required; keyword-based matching is perfectly fine. If some mechanism of up/down-voting jokes can be incorporated, that would be fantastic, but I'm presuming that will be an entirely different game.
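
    A minimal sketch of the kind of keyword lookup and vote tracking being asked for, using Python's built-in sqlite3 in place of the (unspecified) PHP/MySQL stack; the table and column names are hypothetical:

        import sqlite3

        conn = sqlite3.connect("jokes.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS jokes (
                            id INTEGER PRIMARY KEY,
                            body TEXT NOT NULL,
                            score INTEGER NOT NULL DEFAULT 0)""")

        def search_jokes(query):
            # Naive keyword matching: every word must appear in the joke text.
            words = [w for w in query.split() if w]
            sql = "SELECT id, body, score FROM jokes"
            if words:
                sql += " WHERE " + " AND ".join("body LIKE ?" for _ in words)
            sql += " ORDER BY score DESC"
            return conn.execute(sql, ["%" + w + "%" for w in words]).fetchall()

        def vote(joke_id, delta):
            # delta is +1 for an up-vote, -1 for a down-vote.
            conn.execute("UPDATE jokes SET score = score + ? WHERE id = ?",
                         (delta, joke_id))
            conn.commit()

    A full-text index (MySQL FULLTEXT, SQLite FTS5) would scale better than LIKE, but plain keyword matching satisfies the stated requirements.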

  • How to point DNS records to Amazon AWS Elastic Beanstalk

    - by Lance
    I just created a new environment with AWS and uploaded a .zip file into Elastic Beanstalk. A friend's server used to host my site instead of GoDaddy, so I changed my custom DNS name servers from pointing at my friend's server to GoDaddy's. AWS told me that I need to add a CNAME record on GoDaddy, and I did: the alias is lance and the host is lance-env.elasticbeanstalk.com. I know this change can take 24-48 hours to take effect, but it's been a day already, and when I go to my site a default page from GoDaddy appears. I'm very new to AWS and would just like to find a way for AWS to host my site other than using Route53.
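
    One quick way to check whether the CNAME has actually propagated is to query DNS directly. A sketch using the third-party dnspython package (the record name shown stands in for the real subdomain under an example domain):

        import dns.resolver  # pip install dnspython

        # Ask for the CNAME record that GoDaddy should now be serving.
        answers = dns.resolver.resolve("lance.example.com", "CNAME")
        for rdata in answers:
            print("CNAME points to:", rdata.target)

    If this prints lance-env.elasticbeanstalk.com, the record is live and any remaining delay is caching; if it raises NXDOMAIN, the record was not saved where the domain's name servers actually live.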

  • What is the replacement for the Web Intents HTML standard?

    - by Tom
    "Web Intents" were deprecated in Chrome 24 (November 2012) and are no longer supported in any browser: "We also gathered a lot of valuable data and feedback from our experimental support for Web Intents and decided to disable the feature in today's Beta release." Is there an HTML5 standard that I can look into as an alternative to what Web Intents intended? I'm interested in how web services can be stitched together. For example, imagine a website that can import an image from any number of web services, modify the image in some way, then push the file back to any number of other web services, all via HTML5 standards.

  • Is there a way of using HTTPS with Amazon's CloudFront CDN and CNAMEs?

    - by Metalshark
    We use Amazon's CloudFront CDN with custom CNAMEs hanging under the main domain (static1.example.com). We can break this uniform appearance and use the original whatever123wigglyw00.cloudfront.net URLs to get HTTPS, but is there another way? Does Amazon or any similar provider offer HTTPS CDN hosting? Is TLS with its selective encryption, SNI (Server Name Indication), available for use somewhere? Footnote: assuming that the answer is no, but asking in the hope someone knows. EDIT: Now using Google App Engine https://developers.google.com/appengine/docs/ssl for CDN hosting with SSL support.
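
    Whether a given endpoint serves a per-hostname certificate via SNI can be tested from a client: present the hostname during the TLS handshake and see which certificate comes back. A sketch using Python's standard ssl module (the hostname is this question's example CNAME):

        import socket
        import ssl

        host = "static1.example.com"  # the custom CNAME under test
        ctx = ssl.create_default_context()

        with socket.create_connection((host, 443)) as sock:
            # server_hostname is what triggers SNI in the handshake.
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
                print("Subject:", cert.get("subject"))
                print("SANs:", cert.get("subjectAltName"))

    If the handshake succeeds, the provider selected a certificate covering the custom domain for that SNI hostname; with the default context, a generic *.cloudfront.net certificate would instead raise a hostname-verification error, which is itself the answer.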

  • Monitoring JSON requests sent/received from the browser?

    - by Uwe Keim
    I have a website that sends and receives JSON requests via AJAX, and I have failed to find a tool that shows me the communication live, including the content of the JSON calls. I thought that the Google Chrome developer tools or the IE 9 developer tools had such a feature, but again I failed, and searching Google I failed too. So my question is: is there a client-side tool to monitor the content of JSON requests that a website sends to the server?
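
    One option outside the browser is a local intercepting proxy. A sketch of an addon for the mitmproxy tool (an assumption of this example; run it with mitmdump -s log_json.py and point the browser at the proxy) that prints every JSON request and response body as it passes through:

        # log_json.py - run with: mitmdump -s log_json.py
        from mitmproxy import http

        def _dump(kind, url, message):
            # Only print bodies whose Content-Type looks like JSON.
            if "json" in message.headers.get("content-type", ""):
                print(f"[{kind}] {url}")
                print(message.get_text())

        def request(flow: http.HTTPFlow):
            _dump("request", flow.request.pretty_url, flow.request)

        def response(flow: http.HTTPFlow):
            _dump("response", flow.request.pretty_url, flow.response)

    (For what it's worth, the Network panel in current Chrome developer tools does show XHR request and response payloads, so the proxy route is mainly useful when the traffic must be captured outside the browser.)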

  • Anti-aliasing problem

    - by byronyasgur
    I am auditioning fonts on Google Web Fonts, and one I was discounting was Ubuntu because it looked a bit jagged (screenshot below taken straight from Google). However, I afterwards read an article where it was mentioned as a good choice, with a screenshot in which it looked really good (to me, anyway). I am using Windows 7 and have tried looking at it in Chrome and Firefox. I notice the same thing with some other fonts, but this one is a good example because it looks perfect in the article's screenshot but not so good when I look at it on the Google Web Fonts site. I know this is essentially a question about my computer's settings, but I thought this would be the best place to ask: is there something wrong with the settings on my machine, seeing as it's obviously not rendering the font the same way it appeared when the article writer downloaded it and used it in an image? The screenshot from Google ... The screenshot from the article above ...

  • Is it unwise to blacklist an IP address?

    - by hawbsl
    We have a form on a commercial website which has been abused (but only once or twice) by someone from a particular IP address. A colleague wants to blacklist that IP address from the website. Seems to me that's overkill, and that there's a risk that genuine customers sharing that same IP address would be blacklisted too. I suppose a big part of my question is how many people might be sharing that same IP address and could be affected by our blacklist. I suspect that's a "how long's a piece of string" question but some ballpark answer would be really helpful. We're in the UK if that's significant.
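
    How widely shared an address is can be probed from the site's own logs: many distinct browsers behind one IP suggests a corporate or carrier NAT rather than a single abuser. A rough sketch (Apache combined-format log assumed, at a hypothetical path):

        import re
        from collections import defaultdict

        # The user agent is the last quoted field in the combined log format.
        LINE = re.compile(r'(?P<ip>\S+) .*"(?P<agent>[^"]*)"$')

        agents = defaultdict(set)
        with open("access.log") as f:          # hypothetical log path
            for line in f:
                m = LINE.match(line.strip())
                if m:
                    agents[m["ip"]].add(m["agent"])

        suspect = "203.0.113.7"                # hypothetical offending address
        print(len(agents.get(suspect, set())),
              "distinct user agents seen from", suspect)

    A single user agent over time points at one machine; dozens point at a shared gateway, where a blanket block would catch bystanders.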

  • Is there a way to hide text from descriptions in Google?

    - by Linda H
    The first line of text on all of our client's product pages is "Download hi-res images", which of course isn't what we'd want in the description when people search for their products. Is there any way to hide this text/link so that Google and the others just ignore it and go on into the text description below? I suppose we could use a meta-description, but the client isn't very good at computers and it's such a small site it seems silly.

  • How can I exclude content in my notifications bar from being indexed?

    - by Liam E-p
    Of course I want my content to be indexed quickly by search engines, but not my notifications bar. The notifications bar contains the last 30 changes to content on the site, and I don't want it to show up in my search result snippets: the notifications are generic, so they rarely provide any relevant information. For example, if an article named "123" was created, it would produce a notification that says "Article "123" was created by xxx at 12:00AM". I'm now wondering if this is a content design problem, as only about a third of that information (the title, and what happened) is actually relevant to users. Basically, what I am wondering is how I could optimise this so that search engines wouldn't show this generic notification text.

  • Difference between mail. and pop. & smtp.?

    - by Lea Hayes
    When hosting a website I often notice that the mail hostnames are defined under DNS in one of two ways:

        POP  = mail.example.com
        SMTP = mail.example.com

    versus:

        POP  = pop.example.com
        SMTP = smtp.example.com

    Is it wise to use "mail.example.com" for both POP and SMTP when configuring a mail client? What is the difference between the two approaches? It seems to work fine (sends and receives mail as expected).
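
    Both styles usually resolve to the same machine, which is easy to confirm. A small sketch using Python's standard socket module against the example names above:

        import socket

        # Hypothetical hostnames from the question; substitute a real domain.
        for name in ("mail.example.com", "pop.example.com", "smtp.example.com"):
            try:
                print(name, "->", socket.gethostbyname(name))
            except socket.gaierror:
                print(name, "-> does not resolve")

    If all three print the same address, the pop./smtp. names are just conventional aliases (typically CNAMEs) for the one mail host; separate names simply leave room to split the POP and SMTP services onto different machines later without reconfiguring clients.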

  • Where can I get resources to design a website like this? [closed]

    - by Jhon Andrew
    I have a project: to make a CMS for an online game. Can anyone suggest where I can get resources that I can use, like vintage borders, seamless old papers, or other vintage-looking patterns? I would like to come up with something like this website, for example: http://www.gamezaion.com/ I hope someone can give me ideas, inspiration, and examples of how I can come up with something like it. P.S.: I am having a hard time designing, because I define myself as a developer, not a designer.. Lol.

  • Repeating keywords in inbound links

    - by JJ_Jason
    Hi. I have a service similar to bit.ly. The link generation method is similar, but the site is not: a user uses my site just like bit.ly, but I offer a different kind of service, one which I would want to rank on Google for. If I were to generate links such as mysite.com/my-keywords/1Asdf34, would it be considered spammy or black hat? The equivalent for bit.ly would be bit.ly/url-shortening-services/3k1dS4sd. For bit.ly this would defeat the purpose, but URL length in my case does not have to be short.
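
    For illustration, a sketch of the link scheme being described: a keyword slug plus a random short code (all names and the code length are hypothetical):

        import re
        import secrets
        import string

        ALPHABET = string.ascii_letters + string.digits

        def slugify(keywords: str) -> str:
            # "URL shortening services" -> "url-shortening-services"
            return re.sub(r"[^a-z0-9]+", "-", keywords.lower()).strip("-")

        def make_link(keywords: str, domain: str = "mysite.com") -> str:
            code = "".join(secrets.choice(ALPHABET) for _ in range(7))
            return f"https://{domain}/{slugify(keywords)}/{code}"

        print(make_link("my keywords"))  # e.g. https://mysite.com/my-keywords/xK29dQz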

  • How do Expires headers and cache manifest rules work together?

    - by Robert K
    I find the W3C's official Offline Web Applications specification to be rather vague about how the cache manifest interacts with headers such as ETag, Expires, or Pragma on cached assets. I know that the manifest should be checked with each request so that the browser knows when to check the other assets for updates. But because the specification doesn't define how the cache manifest interacts with normal cache instructions, I can't predict precisely how the browser will react. Will assets with a future expiration date be refreshed (no matter the cache headers) when the cache manifest is updated? Or, will those assets obey the normal caching rules? Which caching mechanism, HTTP cache versus cache manifest, will take precedence, and when?
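
    For reference, the manifest itself is fetched through ordinary HTTP caching, which is why a common recommendation is to serve it with headers that forbid caching; a far-future Expires on the manifest can otherwise pin the whole application cache. A sketch of that serving rule using Python's standard http.server (file extension and port are assumptions):

        from http.server import HTTPServer, SimpleHTTPRequestHandler

        class ManifestAwareHandler(SimpleHTTPRequestHandler):
            def end_headers(self):
                # Never let intermediaries or the browser cache the manifest,
                # so manifest updates are seen on the next page load.
                if self.path.endswith(".appcache"):
                    self.send_header("Cache-Control", "no-cache, no-store")
                    self.send_header("Expires", "0")
                super().end_headers()

        HTTPServer(("", 8000), ManifestAwareHandler).serve_forever()

    The assets listed in the manifest, by contrast, are re-fetched when the manifest changes, and those fetches go through the normal HTTP cache, so a far-future Expires on an asset can cause the update to pick up a stale copy.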

  • Registering domains with Network Solutions

    - by Joel
    A few years ago I registered a domain with Network Solutions. In recent years I've been using cheaper services such as namecheap, powerpipe, etc. Every time I need to renew one of the older domains with Network Solutions I am surprised at how much more expensive they are. What is the reason for the price differences between the services? Why should I use a service like Network Solutions if there are so many companies out there that offer domain registration for a very cheap price? Thanks, Meir

  • Best Option for Creating A Small Church Website

    - by Jim
    I've been asked to create a website for a small church. Their prior site was hosted on GeoCities, which no longer exists. They are not looking for anything robust, just an informational site with a calendar and maybe a contact form. The church would also like to be able to administer the site with little technical know-how, and cost is an issue. Given these requirements, something like sites.google.com seems like a good option. However, my main concern is that Sites will suffer the same fate as GeoCities; it is definitely not a flagship Google product. Are there other good alternatives that fit the requirements?

  • .html extension or not for SEO purposes

    - by Scott Schluer
    I know this question has been asked before on Stack Overflow, but what I have not been able to find in the posts I've read are concrete references as to WHY one is better than the other (something I can take to my boss). So I'm working on an MVC 3 application that is basically a rewrite of the existing production application (web forms) using MVC. The current site uses a URL rewriter to rewrite "friendly" URLs with HTML extensions to their ASPX counterparts, i.e. http://www.site.com/products/18554-widget.html gets rewritten to http://www.site.com/products.aspx?id=18554. We're moving away from this with the MVC site, but the powers that be still want the HTML extension on the URLs. As a developer, that just feels wrong on an MVC site. I've written a quick and dirty HttpModule that will perform a 301 redirect from the .html URL to the same URL without the .html extension, and it works fine, but I need to convince management that removing the .html extension is not going to hurt SEO. I'd prefer to have this sort of friendly URL: http://www.site.com/products/18554-widget Can anyone provide information to back up my position, or am I actually trying to do something that WOULD hurt SEO, in which case can you provide references on that?
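
    The redirect the question describes is framework-independent; here it is as a minimal Python WSGI middleware sketch (the original is an ASP.NET HttpModule, so this is only an illustration of the 301 logic, not the author's code):

        def strip_html_extension(app):
            """301-redirect /products/18554-widget.html to /products/18554-widget."""
            def middleware(environ, start_response):
                path = environ.get("PATH_INFO", "")
                if path.endswith(".html"):
                    query = environ.get("QUERY_STRING", "")
                    location = path[:-len(".html")] + (f"?{query}" if query else "")
                    start_response("301 Moved Permanently",
                                   [("Location", location),
                                    ("Content-Length", "0")])
                    return [b""]
                return app(environ, start_response)
            return middleware

    The 301 status is the point for SEO: it tells crawlers the extensionless URL is the permanent address, so existing link equity is consolidated there rather than split between the two forms.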

  • Opinions on .gr (Greek) registrars?

    - by Marc Bollinger
    None of the previous questions tackle the less common country registries beyond .co.uk, .it, et al., or else I'd have found an answer myself. I'm just looking for information for a vanity domain, so obviously I'm alright without an answer, but it's an unasked (or at least unanswered) question, and I'm not exactly in a hurry to give my credit card information across country lines, sight unseen.

  • I need a recommendation for a CMS application with e-commerce

    - by Griff
    Does anyone have a recommendation for an open-source, robust CMS application that has a fully featured e-commerce module? I have been looking into Drupal with Ubercart, but it looks like Ubercart is not fully up to speed with Drupal 7, and the other e-commerce modules don't look as robust. The CMS should support CMIS as both client and server and be able to run in a cloud computing environment. The system could be written in any standard web programming language, although Java would be my preference. I'm posting this question here because it seems that all CMS systems provide e-commerce as an afterthought rather than as a core feature.

  • What are the hard and fast rules for Cache-Control?

    - by Metalshark
    Confession: sites I maintain have different rules for Cache-Control, mostly based on the default configuration of the server, followed up with recommendations from the Page Speed and YSlow Firefox plug-ins and the Network Resources view in Google's Speed Tracer. Cache-Control is set to private or public depending on what they say to do, ETag/Last-Modified headers are only tinkered with if YSlow suggests there is something wrong, and Vary: Accept-Encoding seems necessary when manually gzipping files for Amazon CloudFront. When reading through the material on the different options and what they do, there seems to be conflicting information, rules for broken proxies, and cargo-cult configurations. The official information provided by the analysis tools mentioned above is quite inaccessible, as it deals with each topic individually instead of as a unified strategy (so there is no cross-referencing of techniques). For example, it seems to make no sense that the speed analysis tools rate a site with ETags the same as a site without them if they are meant to help with caching. What are the hard and fast rules for a platform-agnostic Cache-Control strategy?

    EDIT: A link through Jeff Atwood's article explains caching in superb depth. For the record, though, here are the hard and fast rules:

    - If the file is compressed (using gzip, etc.), use "Cache-Control: private", as a proxy may return the compressed version to a client that does not support it (the browser cache will hold files marked this way, though). Also remember to include "Vary: Accept-Encoding" to say that it is compressible.
    - Use Last-Modified in conjunction with ETag; belt-and-braces usage provides both validators, and ETag is based on file contents instead of modification time alone, so using both covers all bases. NOTE: AOL's PageTest has a carte blanche approach against ETags for some reason.
    - If you are using Apache on more than one server to host the same content, remove the implicitly declared inode from ETags by excluding it from the FileETag directive (i.e. "FileETag MTime Size"), unless you are genuinely using the same live filesystem.
    - Use "Cache-Control: public" wherever you can; this means that proxy servers (and the browser cache) will return your content even if the rest of the page needs HTTP authentication, etc.
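
    Those rules translate directly into response headers. A sketch of a helper that emits them for a given file (a hand-rolled illustration of the rules above, not any particular framework's API):

        import email.utils
        import hashlib
        import os

        def cache_headers(path: str, compressed: bool = False) -> dict:
            """Build response headers following the rules listed above."""
            with open(path, "rb") as f:
                data = f.read()
            headers = {
                # Rules 1 and 4: private when serving compressed variants,
                # public wherever you can.
                "Cache-Control": "private" if compressed else "public",
                # Rule 2: both validators - mtime-based and content-based.
                "Last-Modified": email.utils.formatdate(
                    os.path.getmtime(path), usegmt=True),
                "ETag": '"%s"' % hashlib.md5(data).hexdigest(),
            }
            if compressed:
                headers["Vary"] = "Accept-Encoding"
            return headers

    Deriving the ETag from a content hash also sidesteps the multi-server inode problem from rule 3, since identical bytes hash identically on every host.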

  • Best scripting language for project [on hold]

    - by Dave
    This is a subjective question, but I don't know where else to ask it. I'd appreciate it if someone could direct me to an appropriate scripting language for my project; I'm a little new at this. The project is a website that will display a list of photo subject groups (such as "nature", "people", "sports", etc.) on the home page. The photos will all be in subdirectories of the main photo directory (photos), and each subject group will represent a subdirectory of photos. For example, in the directory photos there might be three subdirectories, "nature", "people", and "sports", and in each of those subdirectories there will be the actual photos. The idea is that when the website owner wants to add, delete, or update a subject group, all he has to do is add, delete, or update a subdirectory of the photos directory. This means, I think, that I need a scripting language that can read the directories and files in the website and then send a web page built from that information. What is the simplest and easiest scripting language to do this in? Any ideas? Thanks
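
    Any server-side language with filesystem access can do this; as one illustration, here is a sketch in Python (directory layout as described in the question, HTML deliberately minimal):

        import html
        from pathlib import Path

        PHOTO_ROOT = Path("photos")
        IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif"}

        def subject_groups():
            # Each subdirectory of photos/ is one subject group.
            return sorted(p.name for p in PHOTO_ROOT.iterdir() if p.is_dir())

        def photos_in(group: str):
            return sorted(p.name for p in (PHOTO_ROOT / group).iterdir()
                          if p.suffix.lower() in IMAGE_EXTS)

        def home_page() -> str:
            items = "".join(
                f'<li><a href="/group/{html.escape(g)}">{html.escape(g)}</a></li>'
                for g in subject_groups())
            return f"<ul>{items}</ul>"

    Because the page is generated from the directory tree on every request, adding or removing a subdirectory updates the site with no further work; PHP and others would express the same idea with their own directory-listing calls.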

  • Evidence for automatic browsing - Log file analysis

    - by Nilani Algiriyage
    I'm analyzing web server logs in both Apache and IIS log formats. I want to find evidence of automatic browsing: web robots, spiders, bots, etc. I used the Python package robot-detection 0.2.8 for detecting robots in my log files, but I know there may be other robots (automated programs) which have traversed the web site that robot-detection cannot identify. So I want to ask: Are there any specific clues that can be found in log files that human users do not leave but automated software would? Do they follow a specific navigation pattern? I saw some requests for favicon.ico - does this indicate automated browsing? I found this article and this question with some valuable points.
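
    A few heuristics commonly applied to logs: requests for /robots.txt (well-behaved crawlers fetch it, browsers rarely do), page requests with no accompanying CSS/JS/image fetches, and very regular request timing. A sketch over an Apache combined-format log (pattern simplified, log path assumed):

        import re
        from collections import defaultdict

        # Simplified combined log format pattern (an assumption of this sketch).
        LINE = re.compile(
            r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*"'
            r' \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"')

        ASSET_EXTS = (".css", ".js", ".png", ".jpg", ".gif", ".ico")

        stats = defaultdict(lambda: {"robots_txt": False, "assets": 0, "pages": 0})
        with open("access.log") as f:                 # assumed path
            for line in f:
                m = LINE.match(line)
                if not m:
                    continue
                rec = stats[m["ip"]]
                if m["path"] == "/robots.txt":
                    rec["robots_txt"] = True
                elif m["path"].endswith(ASSET_EXTS):
                    rec["assets"] += 1
                else:
                    rec["pages"] += 1

        for ip, rec in stats.items():
            if rec["robots_txt"] or (rec["pages"] >= 20 and rec["assets"] == 0):
                print("possible bot:", ip, rec)

    None of these clues is conclusive on its own (a favicon.ico request, for instance, is typically browser behavior rather than a bot signal), so they are best combined with the user-agent matching that robot-detection already does.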

  • Separate URLs for a set of pages sharing 80% duplicate content

    - by user131003
    Issue: Currently my site has one particular page which has country-specific data, so I've got URLs like:

        mysite.com/sale-united-states
        mysite.com/sale-united-kingdom
        mysite.com/sale-sweden

    All these pages have 80-90% common content and 10-20% country-specific content, and currently they all canonically point to mysite.com/sale-united-states. The problem is that when someone searches for "sale Sweden", Google dutifully shows the mysite.com/sale-united-states page, which does not feel correct, as it shows the US page instead of the Sweden one. Now I'm thinking of dropping the canonical URL so that the country-specific URLs appear in Google search. But I'm not sure how 80% duplicate content is going to affect SEO. What is the recommended approach for this situation? A friend of mine suggested a separate subdomain per country, but that seems overkill for one page.

  • AdSense: You have rejected ad requests, which will result in lost revenue

    - by Chankey Pathak
    I got an alert on my AdSense account which says: "You have rejected ad requests, which will result in lost revenue. The following ad units have made ad requests with incorrect site information. This occurs when the URL of the server from which the ad unit has been served differs from the URL of the actual page where the ad will be displayed. Learn how to fix these errors." So the solution is that I'll have to use google_page_url = "http://myurl.com/fullpath";. I'm using WordPress; what should the URL be for google_page_url? For example, my website is www.technostall.com. Should I put www.technostall.com there, or should I give the path of each post? That is not good, because I'm using a sidebar widget for the sidebar ad unit and can't change google_page_url for each page. What should I do? This error is appearing only on my sidebar/navigation ad units. Is using google_page_url = document.location; fine?

  • Looking for a Non-Hosted Audio & Video Podcasting Solution for Church Websites

    - by motboys
    I am looking for a solution that will do the following:

    - User uploads audio and/or video files with title, description, image, etc.
    - Solution embeds the info into ID3 tags
    - Solution generates an RSS feed
    - Solution embeds the new content in our website
    - Content on the website is searchable

    This is for a couple of church websites I manage. I am looking for the ability to do the above with a sermon MP3 and also a video. At the moment we are doing it with multiple steps / people involved, and I want to automate the process. I can't seem to find a solution that does all of the above. Thank you!
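
    The tagging and feed-generation steps are easy to script, which may be enough to replace the multi-person process even without a packaged product. A sketch using two third-party Python libraries, mutagen for ID3 tags and feedgen for the RSS feed (file names and URLs are placeholders):

        from mutagen.easyid3 import EasyID3          # pip install mutagen
        from feedgen.feed import FeedGenerator       # pip install feedgen

        # Step 1: embed the uploaded metadata into the MP3's ID3 tags.
        audio = EasyID3("sermon-2024-01-07.mp3")
        audio["title"] = "Sermon: New Beginnings"
        audio["artist"] = "Example Church"
        audio.save()

        # Step 2: generate the podcast RSS feed.
        fg = FeedGenerator()
        fg.load_extension("podcast")   # enables iTunes-specific fields if needed
        fg.title("Example Church Sermons")
        fg.link(href="https://example-church.org/sermons")
        fg.description("Weekly sermon recordings")

        fe = fg.add_entry()
        fe.id("https://example-church.org/sermons/2024-01-07")
        fe.title("Sermon: New Beginnings")
        fe.enclosure("https://example-church.org/media/sermon-2024-01-07.mp3",
                     "12345678", "audio/mpeg")

        fg.rss_file("sermons.xml")

    Embedding the feed in the site and making it searchable would then fall to whatever CMS the sites already run, most of which can render an RSS feed and index its text.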

  • First project is a big one: how much should we charge?

    - by confuzzled
    Two of my cousins and I started a freelance computer repair/web design business to make some money on the side during college, and we received our first major web design project about three weeks ago. We've created websites before, but mostly for family businesses, and we never really charged money; most of those sites were static and didn't require a CMS. This project, however, was a big one (for us anyway). We created a news site that had several categories, we created the banners, and we created a classifieds page (not a web app, just something static that they control). Several links, a few graphical assets, a CSS drop-down menu, an RSS feed from a different news site, weather - all the normal stuff you would find on a regular news site. On top of that we put in all the usual Joomla stuff (search, JComments, JSlide pictures, JCE, etc.). Then we uploaded the first 10 articles they gave us, and we are going to train them how to use Joomla. At first we settled on 700 dollars; I assumed they just wanted a simple blog-like website where they could upload articles, but then we had a meeting and they asked for a lot more. Note: we did not hard-code the template from scratch, but customized the Gantry framework to fit their needs. We did code quite a bit, however. I estimate that we put in about 50-60 hours in total. I'm wondering if 700 dollars is a bit low; this price is definitely not set in stone. Please keep in mind that this is our first project and we are newbies. Please be kind. Thank you!
