Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.

Page 184/528

  • Revamped Joomla site to Google search engine

    - by user3127632
    I am about to upload a revamped Joomla site (an upgrade from 1.5 to 2.5, plus other changes). I am currently working on it on a test-bed subdomain, and in a few days I will do the swap and replace the old site with the new one. I am worried about search engines, and Google specifically. The site currently ranks very well (it appears 2nd in the search results); apart from submitting the new sitemap, what actions do I have to take so that the new site gets picked up and the rank is preserved? It's not a difficult task, but because there is no room for mistakes I am asking for more expert advice.
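
    One hedged suggestion beyond resubmitting the sitemap: if any URLs change between the 1.5 structure and the 2.5 one, add 301 redirects from the old paths to their new equivalents so existing rankings and backlinks carry over. A minimal .htaccess sketch using mod_alias; every path here is a placeholder, since the real mapping depends on the old site's SEF settings:

        # Old Joomla 1.5 URLs -> new 2.5 URLs; all paths below are placeholders
        Redirect 301 /content/view/12/34/ http://www.example.com/new-section/new-article
        Redirect 301 /old-category/old-article.html http://www.example.com/new-category/new-article

    Keeping URLs identical wherever possible and watching Webmaster Tools for crawl errors after the swap covers most of the rest.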


  • Can 302 redirect on wordpress homepage affect SEO?

    - by user2402116
    I manage a website. In order to send visitors to the latest post when they enter the website, I set up a 302 redirect from the homepage URL to the latest post URL. There are only one or two posts a month, so the redirect is updated manually each time a new post is published. Despite registering the website with Google, it doesn't get indexed at all. After searching blogs and forums for a solution, I understand the issue might be related to the 302 redirect. It's the first time I've used redirects on a website; can someone help me?
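
    For context, a 302 tells Google the move is temporary, so the homepage keeps being treated as the canonical page while showing no stable content of its own, which fits the indexing problem described. If a redirect is kept at all, one option is an automatic 301 set from a small WordPress hook rather than a hand-edited 302; a hedged sketch (template_redirect, get_posts, get_permalink and wp_redirect are real WordPress APIs, the rest is illustrative):

        <?php
        // In the theme's functions.php: send the front page to the newest post
        // with a 301 instead of a hand-edited 302. (Showing the latest post's
        // content directly on the homepage avoids the redirect entirely.)
        add_action('template_redirect', function () {
            if (is_front_page()) {
                $latest = get_posts(array('numberposts' => 1));
                if (!empty($latest)) {
                    wp_redirect(get_permalink($latest[0]), 301);
                    exit;
                }
            }
        });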


  • Best way to redirect users back to the pretty URL who land on the _escaped_fragment_ one?

    - by Ryan
    I am working on an AJAX site and have successfully implemented Google's AJAX crawling recommendation by creating _escaped_fragment_ versions of each page for it to index. Thus each page has two URLs: pretty: example.com#!blog, ugly: example.com?_escaped_fragment_=blog. However, I have noticed in my analytics that some users are arriving on the site via the "ugly" URL, and I am looking for a clean way to redirect them to the pretty URL without impacting Google's ability to index the site. I have considered using a 301 redirect in the response header, but fear that Googlebot might try to follow it and end up in an endless loop. I have also considered using a JavaScript redirect that Googlebot wouldn't execute, but fear that Google may interpret this as cloaking and penalize the website. Is there a good, clean, acceptable way to redirect real users away from the ugly URL if for some reason or another they end up arriving at the site that way?
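
    One approach that avoids a blanket 301 on the ugly URL is to redirect only when the request does not come from the crawler, so Googlebot keeps getting the HTML snapshot and humans get bounced back to the #! version. A hedged PHP sketch; the domain and the detection logic are illustrative, and user-agent sniffing always carries some risk of being read as cloaking, so the snapshot content should stay identical to the AJAX page:

        <?php
        // Front controller: handle ?_escaped_fragment_=blog style requests.
        if (isset($_GET['_escaped_fragment_'])) {
            $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
            if (stripos($ua, 'Googlebot') === false) {
                // A real visitor: send them back to the pretty #! URL.
                $fragment = $_GET['_escaped_fragment_'];
                header('Location: http://example.com/#!' . rawurlencode($fragment), true, 301);
                exit;
            }
            // Otherwise fall through and render the static snapshot for the crawler.
        }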


  • With a dash or not in the domain name? [closed]

    - by menardmam
    Possible Duplicate: Is it better to put hyphens in a domain name? I have to buy a domain, and it can be either somebigcompany.com or some-big-company.com. I know it's not a simple answer, but I'd like to know your point of view. The client wants the version with no dash; I think the dashed version is better for Google and for readability. I have talked to an SEO expert who said that since the Panda update to Google's algorithm, domains with a dash are penalized. I don't believe it! After your answers I'll go to GoDaddy and buy it. Thanks in advance.


  • Facebook internal search via Google [migrated]

    - by Alexis
    I am currently working on a challenge: basically, using the Google API, I search Facebook fan pages, specifically the "About" section, for the email address and the date the business was founded. So far I have come up with this: site:facebook.com/pages + "business type" + "country" + "@email.com" I need to add something else so that it also gives me back the date the business was founded. If you look at the About section of a Facebook fan page, there is, for example, "(Founded 06/02/2010)". That bracketed information is what I need my query to capture as well. Any ideas?


  • Clean URLs using .htaccess

    - by Napster
    I am trying to implement clean URLs using .htaccess. After searching for some time I found this code:

        RewriteRule latestnews/([a-zA-Z0-9]+)/$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]
        RewriteRule latestnews/([a-zA-Z0-9]+)$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]

    So when I try to access the URL http://thinkmovie.in/index.php/latestnews/272 it redirects to http://thinkmovie.in/index.php/latestnews?nid=272, but what I want is to keep the URL in the browser's address bar as http://thinkmovie.in/index.php/latestnews/272.
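
    The visible redirect happens because the rule's target is an absolute URL (scheme and host), which mod_rewrite always turns into an external redirect; pointing the rule at a path keeps the rewrite internal and the address bar untouched. A minimal sketch, assuming the rules sit in the site root's .htaccess and that the pretty URLs are requested without the index.php prefix (adjust the pattern otherwise):

        RewriteEngine On
        # Internal rewrite: no http://host in the target, so the browser URL stays put
        RewriteRule ^latestnews/([a-zA-Z0-9]+)/?$ /index.php/latestnews/?nid=$1 [L,QSA]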


  • How to execute the MySQL command DELIMITER

    - by user5332
    Hi, I have a huge problem (for me). I need to execute the MySQL command DELIMITER | from PHP, but mysql_query fails with an error, and I found out that mysql_query doesn't support the use of DELIMITER, because this command only works in the mysql console. Yet when I open phpMyAdmin, there is an option on the SQL tab to change the delimiter and it works, though I don't know how. Could you help me? How is it possible to change the delimiter from PHP? I need to do it before a CREATE TRIGGER statement that uses several ; characters which must not be interpreted as the end of the command.
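
    For context, DELIMITER is purely a client-side convenience of the mysql console (phpMyAdmin emulates it in its SQL tab); it is never sent to the server. A single CREATE TRIGGER statement passed through the API needs no delimiter change at all, because the API does not split the string on ; (the same holds for the old mysql_query, as long as the whole trigger goes in one call). A hedged sketch using mysqli; credentials and table names are placeholders:

        <?php
        // One statement, one query() call: the semicolons inside BEGIN ... END are fine.
        $db = new mysqli('localhost', 'user', 'password', 'mydb');

        $sql = "
        CREATE TRIGGER after_order_insert
        AFTER INSERT ON orders
        FOR EACH ROW
        BEGIN
            UPDATE stats SET order_count = order_count + 1;
            INSERT INTO order_log (order_id, logged_at) VALUES (NEW.id, NOW());
        END";

        if (!$db->query($sql)) {
            die('Trigger creation failed: ' . $db->error);
        }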


  • Is a merchant account a requirement for a website to take payments?

    - by calum
    I have had a quick look but couldn't see anything related. Basically, if we were to accept payments for events on our website via PayPal (essentially a "Buy it now!" button), as a business, do we need a merchant account, or will a regular bank account be acceptable? I may be confusing some terms. My understanding is that you need a merchant account to accept credit card payments, but as we are using PayPal, is this necessary? Thank you for any clarification. Disclaimer: I've read "What are some options for taking payments on my website?" but it doesn't explicitly say whether we require a merchant account or not. Thank you.


  • 301 redirects mirrored domain

    - by Dave
    I'm redesigning a site for a friend on my localhost. His old site is an .asp-based site and we're replacing it with a WordPress site on LAMP hosting. The old site sits on domain A and also has another domain, domain B, parked on top of it, mirroring it. Google has picked up domain B for most of his search engine results, while Yahoo and Bing have picked up domain A. The plan is to 301 redirect the old pages of his site on domain A to the new WordPress versions and park domain B on top of it as before. My question is, will this work? If not, what would be a better way to approach it? We'd prefer not to lose any of the search engine listings in the redesign, and the search engines don't appear to have penalized him for duplicate content. Thanks very much in advance!
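
    A hedged sketch of how that might look in the new site's .htaccess, assuming Apache with mod_rewrite (domain and page names are placeholders): fold the parked mirror onto one canonical host first, then map the old .asp pages onto their new WordPress permalinks, so whichever domain a search engine has indexed ends up on a single canonical URL.

        RewriteEngine On

        # 1. Send the parked mirror (domain-b.com) to the canonical domain-a.com
        RewriteCond %{HTTP_HOST} ^(www\.)?domain-b\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain-a.com/$1 [R=301,L]

        # 2. Map old .asp URLs to the new WordPress permalinks
        RewriteRule ^about\.asp$ /about/ [R=301,L]
        RewriteRule ^products\.asp$ /products/ [R=301,L]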


  • What kind of spam is this?

    - by SSilk
    I realize this is a pretty vague question, but I occasionally get spam messages through my contact form on a Drupal 6 site. The contact form does not have any anti-spam protection (e.g. a math question). The messages I get are all very similar, just jumbled junk like the example below, so I think they're all from the same source. Example: ylsaf0V bpsdfuxnhjjd, [url=http://wwgfsggzgyjyjm.com/]wwgrfgzrgsjyjm[/url], [link=http://xmgvyghcuufvb.com/]xmjyhvyjyfjirovb[/link], http://frgxmdghrgruhfc.com/ Anyway, I'm just wondering what the point of such a message is. All the links are dead, it's illegible, and it's not trying to sell me a product or get me to do anything, so I'm a bit perplexed. Is there any way to tell where they're coming from? And how concerned should I be? To be clear, I'm not asking how to avoid them; I realize just adding a simple math challenge or captcha would likely do the job.


  • Links shortener with advanced reporting?

    - by Qualcuno
    I am searching for a script (preferably in PHP) or an external solution that lets me create a "URL shortener" with advanced reports. We have been using Google Short Links for a while: it works really well, but it lacks reporting (it only displays a counter with the total number of redirects). Our setup is as follows: "go.mydomain.com" points to the web service, and we can create links such as "go.mydomain.com/product1". What I'm looking for is a similar service (or self-hosted solution) but with advanced reports, so we can track redirects by day, month, etc., distinguish between mobile and desktop users (very important!), and so on.


  • How to allow Google Images search to bypass hotlink protection?

    - by Marco Demaio
    I have noticed that Google Images seems to index my images only if hotlink protection is off. I use hotlink protection anyway, because I don't like the idea of people sucking up my bandwidth; I simply use this code to protect my sites from being hotlinked:

        RewriteEngine on
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com/.*$ [NC]
        RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com$ [NC]
        RewriteRule .*\.(jpg|jpeg|png|gif)$ - [F,NC,L]

    But in order to allow Google Images search to bypass my hotlink protection (I want Google Images to show my images), would it suffice to add lines like these:

        RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com/.*$ [NC]
        RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com$ [NC]

    Because I'm wondering: does the crawler crawl only from google.com? What about google.it, google.co.uk, etc.? FYI: I did not find anything about this in Google's official guidelines. I assume hotlink protection is what keeps my images out of Google Images results, because I ran some tests and it does seem that hotlink protection prevents my images from being shown in Google Images search.
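
    Google does not refer only from google.com, and the image crawler (Googlebot-Image) typically sends no referrer at all when it fetches files, which the empty-referrer condition above already allows. A hedged variant that additionally lets requests referred from any google.* country domain through and, belt and braces, never blocks the Googlebot-Image user agent; the regexes are illustrative, so test them before relying on them:

        RewriteEngine on
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mydomain\.com($|/) [NC]
        # Any Google country domain: google.com, google.it, google.co.uk, images.google.de, ...
        RewriteCond %{HTTP_REFERER} !^https?://([^/]+\.)?google\.[a-z.]+($|/) [NC]
        # Never block Google's image crawler itself
        RewriteCond %{HTTP_USER_AGENT} !Googlebot-Image [NC]
        RewriteRule .*\.(jpg|jpeg|png|gif)$ - [F,NC,L]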


  • First project is a big one; how much should we charge?

    - by confuzzled
    Two of my cousins and I started a freelance computer repair and web design business just to make some money on the side during college, and we received our first major web design project about three weeks ago. We've created websites before, but mostly for family businesses, and we have never really charged money; most of those sites were static and didn't really require a CMS. This project, however, was a big one (for us, anyway). We created a news site with several categories, we created the banners, and we created a classifieds page (not a web app, just something static that they control). There are several links, a few graphical assets, a CSS drop-down menu, an RSS feed from a different news site, weather, and all the normal stuff you would find on a regular news site. On top of that we put in all the usual Joomla extras (search, Jcomments, Jslide pictures, JCE, etc.). Then we uploaded the first 10 articles they gave us, and we are going to train them in how to use Joomla. At first we settled on 700 dollars; I assumed they just wanted a simple blog-like website where they could upload articles, but then we had a meeting and they asked for a lot more. Note: we did not hard-code the template from scratch, but customized the Gantry framework to fit their needs. We did code quite a bit, however. I estimate that we put in about 50-60 hours in total. I'm wondering if 700 dollars is a bit low; this price is definitely not set in stone. Please keep in mind that this is our first project and we are newbies, so please be kind. Thank you!


  • How to target just one search engine and optimise for that

    - by mickburkejnr
    I've been dabbling with SEO a lot in the last 6 months, and one thing that has surprised me is the disparity between Google and Bing in the way they deliver results. A website ranked for a specific keyword or phrase on Google may sit 3rd on the first page, while the same website for the exact same keyword or phrase on Bing is ranked 15th. I came up with the idea of increasing traffic to my website by targeting Bing instead of Google, for several reasons. The biggest is that while Bing is not the biggest search provider, people still use it, and I feel that if other websites have been "neglected" and not optimised for Bing, my website would stand a better chance of getting near the top of its search rankings. The question is, how would I do this? A lot of the SEO advice on the internet is generic, but I can't help feeling it's Google-oriented, for obvious reasons. How could I optimise my website to be Bing-friendly rather than Google-friendly? I know it sounds like suicide to take myself out of the Google mindset, but I feel it could work wonders for traffic to the site.


  • What are the hard and fast rules for Cache Control?

    - by Metalshark
    Confession: the sites I maintain have different rules for Cache-Control, mostly based on the default configuration of the server, followed up with recommendations from the Page Speed and Y-Slow Firefox plug-ins and the Network Resources view in Google's Speed Tracer. Cache-Control is set to private or public depending on what those tools say to do, ETag/Last-Modified headers are only tinkered with if Y-Slow suggests something is wrong, and Vary: Accept-Encoding seems necessary when manually gzipping files for Amazon CloudFront. When reading through the material on the different options and what they do, there seems to be conflicting information, rules for broken proxies, and cargo-cult configurations. The official information provided by the analysis tools mentioned above is quite inaccessible, as it deals with each topic individually instead of as a unified strategy (so there is no cross-referencing of techniques). For example, it seems to make no sense that the speed analysis tools rate a site with ETags the same as a site without them if ETags are meant to help with caching. What are the hard and fast rules for a platform-agnostic Cache-Control strategy? EDIT: A link to Jeff Atwood's article explains caching in superb depth. For the record, though, here are the hard and fast rules. If a file is compressed (using gzip, etc.), use "Cache-Control: private", as a proxy may otherwise return the compressed version to a client that does not support it (the browser cache will still hold files marked this way); also remember to include "Vary: Accept-Encoding" to signal that the response varies by compression. Use Last-Modified in conjunction with ETag: belt-and-braces usage provides both validators, and since ETag is based on file contents rather than modification time alone, using both covers all bases (NOTE: AOL's PageTest takes a carte blanche stance against ETags for some reason). If you are using Apache on more than one server to host the same content, remove the implicitly declared inode from ETags by excluding it from the FileETag directive (i.e. "FileETag MTime Size"), unless you are genuinely sharing the same live filesystem. Use "Cache-Control: public" wherever you can; this means that proxy servers (and the browser cache) will return your content even if the rest of the page needs HTTP authentication, etc.
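
    As a concrete, necessarily hedged starting point, the rules above translate roughly into the following Apache configuration; the expiry times are placeholders to tune per asset type, and the snippet assumes mod_expires and mod_headers are available:

        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/png              "access plus 1 month"
            ExpiresByType image/jpeg             "access plus 1 month"
            ExpiresByType text/css               "access plus 1 week"
            ExpiresByType application/javascript "access plus 1 week"
        </IfModule>

        <IfModule mod_headers.c>
            # Static assets may be cached by shared proxies as well as browsers
            <FilesMatch "\.(png|jpe?g|gif|css|js)$">
                Header set Cache-Control "public"
            </FilesMatch>
            # Anything served compressed should advertise that to caches
            Header append Vary "Accept-Encoding"
        </IfModule>

        # Multi-server setups: drop the inode so ETags match across machines
        FileETag MTime Size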


  • Alternative to nofollow: custom 302 url shortener?

    - by Dogweather
    Here's the scenario: lots of blogging platforms make it tedious to insert nofollow into links within the post content; that is, you need to edit the HTML, format it correctly, and so on. I have a client who posts lots of content with links that should be nofollowed, and I thought of a novel way to handle this, since the blogging platform they're using makes it hard: I install a URL shortener web app on the client's domain. The shortener works as normal, except it redirects via 302 instead of 301. The PageRank will therefore stay at the shortener's domain and not flow on to the target site. Part 2: in order to get the PageRank to collect somewhere meaningful, say on the site's home page, the shortened URLs would be generated like this: /link?12345 instead of /link/12345. Then the path /link would 301 to the home page. This way the id is a query parameter, not a path element, and thus all the incoming shortened links go to one path, which transfers PageRank to the home page. So that's my idea. I wanted to see if anybody could find problems with it. Thanks!
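
    The mechanics are small enough to sketch; a hedged PHP stand-in for the /link endpoint, where the lookup table would really live in a database and the IDs and URLs here are made up:

        <?php
        // /link?12345 -> 302 to the target; bare /link -> 301 to the home page.
        $targets = array(
            '12345' => 'http://example.org/some-external-page',
            '12346' => 'http://example.net/another-target',
        );

        $id = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';

        if (isset($targets[$id])) {
            header('Location: ' . $targets[$id], true, 302); // temporary: PageRank stays put
        } else {
            header('Location: http://example.com/', true, 301); // bare /link collects on the home page
        }
        exit;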


  • Using only password to authenticate user (no "username" field)

    - by Guy
    I am creating a client access system that lets clients manage invoices, make payments, and access information about their products, along with similar information and functionality. There are supposedly fewer than 1000 clients. Would there be any security threat in using only a password (a UUID v4 string) to authenticate a user? My thoughts: there is virtually no probability of a collision or of a successful brute-force attack (http://en.wikipedia.org/wiki/UUID#Random%5FUUID%5Fprobability%5Fof%5Fduplicates); it is user friendly (one click and go); and it is not intended to be remembered.
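
    The practical caveat is that a bearer key like this is still a credential, so it should come from a cryptographically secure generator, be stored hashed, be compared in constant time and be rate-limited like any password. A hedged PHP 7 sketch with storage left out; function names are illustrative:

        <?php
        // Generate an RFC 4122 version-4 UUID from cryptographically secure bytes.
        function uuid_v4() {
            $bytes = random_bytes(16);
            $bytes[6] = chr((ord($bytes[6]) & 0x0f) | 0x40); // version 4
            $bytes[8] = chr((ord($bytes[8]) & 0x3f) | 0x80); // RFC 4122 variant
            return vsprintf('%s%s-%s-%s-%s-%s%s%s', str_split(bin2hex($bytes), 4));
        }

        // Compare a submitted key against its stored hash in constant time.
        function key_is_valid($submitted, $storedHash) {
            return hash_equals($storedHash, hash('sha256', $submitted));
        }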


  • Creating a backup - Rsync - Connection refused (111)

    - by pablofiumara
    I am trying to create a backup of my website for free. I just want a backup of my website, including not only all the files and the configuration but also the databases; in other words, a full backup. If it can be done automatically, even better. I feel there are better ways than using cPanel to achieve this (in fact, I believe some web hosts don't offer any cPanel at all). I read the following advice on how to do it: "Automatically mirror the entire contents and configuration of your main server to a secondary backup server on a completely separate network in a different data centre. Use RSync, FXP, cPanel voodoo, or whatever method you wish to automate syncing." That is why I installed the rsync daemon, which is an alternative to SSH for remote backups. I configured it, but the test went wrong. The terminal shows me this:

        pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share
        [sudo] password for pablofiumara:
        rsync: failed to connect to pablofiumara.com (50.87.147.75): Connection refused (111)
        rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9]
        pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share
        failed to connect to 50.87.147.7 (50.87.147.7): Connection refused (111)
        rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9]

    What should I do? Is there a better or easier way to achieve what I described in the first paragraph?
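
    "Connection refused" on port 873 usually just means no rsync daemon is listening on the remote host, which is common on shared hosting. If the host allows SSH logins, rsync can tunnel over SSH instead, which needs no daemon at all; note the single colon rather than the double colon. A hedged sketch with placeholder user, host and paths:

        # Pull the whole site over SSH into a local backup folder
        rsync -avz -e ssh user@example-host.com:/home/user/public_html/ /backups/site/

        # If the daemon route is really wanted, first check that port 873 answers at all:
        nc -vz example-host.com 873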


  • Too many access denied errors showing in Google Webmaster Tools every day

    - by user2255733
    I get 18,000 "access denied" errors showing in Google Webmaster Tools every day! Strangely, they show up for URLs with www but not for the non-www versions. Fetch as Google works perfectly for the pages that got the error. Google has started to downgrade my website: impressions have dropped from 35,000 to 18,000. I am using the CloudFlare CDN and .htaccess mod_rewrite. Any help will be extremely appreciated, as I am really losing control.


  • Suggestions for a self-serve advertising service

    - by Mystere Man
    I am seeking a self-serve advertising service for my websites, but I have a few restrictions that seem to make what I'm looking for hard to find. Specifically, I want to place "advertise here" links on my pages and allow end users to purchase advertising on that site, page and location. These ads will not be part of a national network. It must support multi-tenancy: I have a number of domains using the same web application but with customized content per domain, and when a customer wants to advertise on a given domain, the ads should only appear on that domain and on that page of the domain (even though the page name may be the same across multiple domains). It must support fixed ad prices, not just CPC; I need monthly and quarterly pricing regardless of performance. It should integrate with OpenX and other ad networks, so that if there is no self-serve ad in a given zone it falls back to national or direct advertising. Shiny Ads has much of this, but I'm looking for alternatives, as their prices are a bit crazy (20%) and they can only do PayPal.


  • How can I exclude content in my notifications bar from being indexed?

    - by Liam E-p
    Of course I want my content to be indexed quickly by search engines, but not my notifications bar. My notifications bar contains the last 30 changes to content on the site, and I don't want it to show up in my SEO meta description. As all the notifications are generic, they rarely provide any relevant information. For example, if an article named "123" was created, it generates a notification that says "Article "123" was created by xxx at 12:00AM". I'm now wondering if this is a content design problem, as only about a third of that information (the title and what happened) is actually relevant to users. Basically, what I am wondering is how I could optimise this so that search engines don't show this generic nonsense.


  • Hide folder names or such?

    - by Miller
    Okay, I have a cPanel account with unmetered everything (I pay a bit per month), so I want to host my forum on it, etc. Let's say I have the domains money.com, wordpress.com and forum.com. I'll have to put everything in different folders; for instance, money.com will be in /m/, wordpress.com in /w/ and forum.com in /forum/, or something like that. What I'm asking is: how do I hide the folder so that money.com/m/ looks like it is actually money.com? I need to hide the folder name the contents are in, so I can host multiple sites and each site looks like the only site on the host, without having to add a redirect to send it to the folder. Thanks guys, I've been trying for a while!
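
    In cPanel the usual route is to add each extra domain as an addon domain and point its document root straight at the sub-folder, which hides the folder automatically. If that is not available, a mod_rewrite fallback in the root .htaccess can map each host onto its folder without changing the URL; a hedged sketch using the folder names from the question:

        RewriteEngine On

        RewriteCond %{HTTP_HOST} ^(www\.)?money\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/m/
        RewriteRule ^(.*)$ /m/$1 [L]

        RewriteCond %{HTTP_HOST} ^(www\.)?forum\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/forum/
        RewriteRule ^(.*)$ /forum/$1 [L]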


  • Deny access to a folder on hosting server but serve the pages

    - by Sourav
    My hosting server allows me to host multiple websites. The directory structure is like this:

        root
        |_ www.a.com
        |_ www.b.com
        |_ www.c.com
        |_ www.d.com

    I want to put some PHP files in the www.d.com folder so that anyone browsing the site from a web browser can use them, but no one can get at their source code (not even by logging in to the root folder). Is there any way of doing this? There is a feature called "Password protect folder" or something similar; can it help in this case?


  • Will moving to Facebook/Disqus Commenting lighten the load on my server any?

    - by sublet
    I manage a site that gets about 50 million hits a month. It's a WordPress site, load-balanced over 6 servers, with a Varnish caching system set up. Right now, 95-97% of the time page views hit the cache. The only time a new page is served from the server is when a new story is created, or when someone is logged in looking at the stories and commenting. What I am trying to figure out is whether moving over to Facebook Comments or Disqus, and getting rid of user accounts entirely, will lighten the load. I would think it would, because the only time you hit the server rather than the cache is when you're logged in, which only the admins would be. I know it's only 2.5-3%, but I wasn't 100% sure.


  • Which image sharing websites support dynamic file uploads via an API?

    - by KoolKabin
    I have been searching for an image hosting website that displays a user's images in a nice, managed way. I want to upload files from a page on my own website into my account on that image hosting website. That is, if I have a website abc.com, a user browses abc.com and uploads a file to my website; I then want to transfer the uploaded file to the image hosting website, so that it can be viewed by that hosting site's other users and gets better visibility to the world.

