Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.

Page 248 of 528

  • Fresh start outside Google's crapbox [on hold]

    - by Krzysztof Minister Bytu
    I may have experimented with my website too much: Google first cut my visitor flow considerably, and for the last 4 days I haven't had a single visit. It's frustrating that they've done this, because I've put a lot of work into the site, but that's a topic for another day. My question is about avoiding this in future. I want to move the partly improved design from that website to a new one under a new domain name. In that case, do I also have to change my hosting account (its address contains my old website's name), or is changing the domain enough for Google to treat the site as something new from a "fresh user"? In other words, does Google look past the domain name at the underlying hosting address? I'd hate to waste another few months of hard work, so I'd rather take every possible precaution, but not paying for a second hosting account would be easier on the wallet.

    Read the article

  • How does a private-sales ecommerce site handle its SEO?

    - by 142857
    On a private-sales ecommerce site, users need to sign up or sign in before they can access the pages of the website. Even if a user tries to navigate directly to a product page, he is redirected to sign in. I am wondering how these sites manage their SEO, since this would imply that Google can't crawl these pages either. Or do they simply forgo the SEO benefit of letting Google crawl the product and catalogue pages?
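
    One common pattern for this (not stated in the post, just a frequently used approach) is to let verified search-engine crawlers past the sign-in wall while redirecting ordinary visitors. Below is a minimal Python sketch of the crawler check, using the reverse-plus-forward DNS verification Google documents for Googlebot; where and how you call it depends entirely on your framework.

        import socket

        def is_verified_googlebot(ip):
            """Reverse-then-forward DNS check used to verify a genuine Googlebot request."""
            try:
                host = socket.gethostbyaddr(ip)[0]
                if not host.endswith(('.googlebot.com', '.google.com')):
                    return False
                # Forward-confirm: the claimed hostname must resolve back to the same IP.
                return ip in socket.gethostbyname_ex(host)[2]
            except OSError:
                return False

    The usual caveat applies: whatever a verified crawler is shown should match what a signed-in user would eventually see, otherwise this drifts into cloaking territory.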

    Read the article

  • What is the correct frequency of changing content regularly?

    - by SSRB
    What is the right frequency for updating content regularly? Suppose I have a site, "Seven Sea", with 5 links: Home, About Us, Product, Sitemap and Contact Us. It is good for a site to change its content regularly, but is there a minimum and maximum frequency for doing this? If I change my content daily, is that good from an SEO point of view? Or, if I change my content only once a year, is that bad for SEO? Which is the better choice? A REQUEST: if this type of question has already been answered, please give me a link to that answer rather than closing the question.

    Read the article

  • How to add SMS text messaging functionality to my website?

    - by jessegavin
    I want to add the ability to send reminders to people via email and SMS for specific events that they have signed up for on a web application that I am building. The email part is not difficult, but I am wondering where to find a good solution for sending SMS messages. It would also be a plus if this solution allowed two-way SMS communication with my web application so that people would be able to reply with a CONFIRM or CANCEL type of a message. Has anyone implemented something like this? Does anyone know of good tools out there? EDIT: I am realizing that this is more of a "lots of ways to skin this cat" type of question and so I changed it to community wiki.
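
    For the sending side, SMS gateway providers expose simple REST APIs; Twilio is one commonly cited option (it is not mentioned in the post, so treat it purely as an illustration). A rough sketch with Twilio's Python helper library, assuming you already have an account SID, auth token and a provisioned number - all placeholders below:

        from twilio.rest import Client  # pip install twilio

        # Placeholder credentials and numbers - substitute your own.
        client = Client("ACXXXXXXXXXXXXXXXX", "your_auth_token")

        message = client.messages.create(
            body="Reminder: your event starts tomorrow at 9am. Reply CONFIRM or CANCEL.",
            from_="+15005550006",   # your gateway-provided number
            to="+15551234567",      # the attendee's number
        )
        print(message.sid)

    Two-way replies (the CONFIRM/CANCEL flow) typically work by pointing the number's incoming-message webhook at a URL on your application, which then parses the reply text.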

    Read the article

  • Django - need to split a table across multiple locations [closed]

    - by MikeRand
    Hi all, I have a Django project to track our company's restructuring projects. Here's the very simple model:

        class Project(models.Model):
            code = models.CharField(max_length=30)
            description = models.CharField(max_length=60)

        class Employee(models.Model):
            project = models.ForeignKey(Project)
            employee_id = models.IntegerField()
            country_code = models.CharField(max_length=3)
            severance = models.IntegerField()

    Due to regulations in some European countries, I'm not allowed to keep employee-level severance information in a database that sits on a box outside of that country. In Django, how do I manage the need to have my Employee table split across multiple databases based on an Employee attribute (i.e. country_code) in a way that doesn't impact anything else in the project (e.g. views, templates, admin)? Thanks, Mike
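
    A piece of Django machinery that maps onto this requirement is a database router (a sketch only - the legal question of where data may live is separate). The database alias 'eu_restricted' and the country codes below are hypothetical; each restricted country would get its own entry in DATABASES.

        # routers.py (hypothetical module) - route Employee rows by country_code.

        RESTRICTED = {'DEU', 'FRA'}   # hypothetical codes that require in-country storage

        class EmployeeRouter:
            def _db_for_instance(self, model, hints):
                instance = hints.get('instance')
                if model.__name__ == 'Employee' and instance is not None:
                    if instance.country_code in RESTRICTED:
                        return 'eu_restricted'   # defined alongside 'default' in DATABASES
                return None                      # "no opinion": fall back to the default database

            def db_for_read(self, model, **hints):
                return self._db_for_instance(model, hints)

            def db_for_write(self, model, **hints):
                return self._db_for_instance(model, hints)

        # settings.py
        # DATABASE_ROUTERS = ['myproject.routers.EmployeeRouter']

    Routers keep views, templates and the admin largely untouched, but they are not fully transparent: queryset filters don't carry an instance hint, and Django won't join the ForeignKey to Project across databases, so the Project data has to exist in (or be mirrored to) every database that holds Employee rows.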

    Read the article

  • Tracing and blocking IPs

    - by Mark
    I was wondering if there is any way to block or exclude people who submit multiple fake IPs. For example, I have a Google pay-per-click campaign set up on AdWords. Within minutes I downloaded a program that let me hide my IP or present a fake one, using a different IP each time. I tried this on my own ad link on Google; the click got through AdWords and I was charged for it. My question is: how can you counter or block someone who continuously clicks on your link with a different fake IP each time?

    Read the article

  • GZipped Images - Is It Worth It?

    - by charlie
    Most image formats are already compressed. Still, if I take an image, gzip it, and compare the compressed file to the original, there is a difference in size, even if not a dramatic one. The question is: is it worth gzipping images? The content size sent to the client's browser will be smaller, but there will be some client overhead in decompressing it. Please advise.
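
    The trade-off is easy to measure directly. A quick sketch (assuming a local JPEG to test with; the filename is a placeholder) that compares the raw size with the gzip-compressed size:

        import gzip

        with open("photo.jpg", "rb") as fh:   # any already-compressed image file
            raw = fh.read()

        compressed = gzip.compress(raw, compresslevel=9)
        saving = 100.0 * (1 - len(compressed) / len(raw))
        print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes, saving: {saving:.1f}%")

    For JPEG, PNG and GIF the saving is usually a percent or two at best, which is why servers are typically configured to gzip text responses (HTML, CSS, JS) and leave image types alone.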

    Read the article

  • Does the e-commerce platform matter to customers?

    - by c s h
    The place I work is now looking into developing a new e-commerce site on the Magento platform, and Magento will fill all of our needs. I was just wondering whether doing it this way is in any way unprofessional (the impression we make is something we are really worried about): will people who visit the site look at our business differently knowing we used Magento or any other e-commerce platform? There are ways to find out - I use Chrome Sniffer to see which platforms were used to build a site, and similar tools exist for other browsers.

    Read the article

  • Is it possible to tell a search engine not to index a specific section of an HTML page? [closed]

    - by Justin
    Possible Duplicate: Preventing robots from crawling specific part of a page. I know you can use robots.txt to ignore entire pages or sections of your site, but is there a way to tell crawlers like the Googlebot to ignore specific sections of an HTML page? I found a blog post that discusses one method, but it appears to work only for the Google Search Appliance, not the Googlebot. Is there some method, at least for Google, to do this?

    Read the article

  • How should I track multi-valued page attributes (e.g. tags) using custom variables?

    - by Simon
    Our pages can each have many tags, e.g. 'football', 'sms', 'nsfw', etc., which we would like to track in Google Analytics. We're already tracking things like category using Google Analytics custom variables, and we've used three of the five available slots so far. How can we track tags the same way? If we just mush them all together - e.g. 'football, sms, nsfw' - can we still report on the pages tagged 'football'? What's the right way to track multi-valued page attributes using custom variables?

    Read the article

  • MachForm, RackForms (formerly FormBoss), Form Tools [on hold]

    - by user39200
    This is my first question here. I am looking for a form builder that allows rapid form development even by beginners, without much programming knowledge, and I prefer a solution we can install on our own servers. I have shortlisted 3 products for further review: MachForm, RackForms (formerly FormBoss) and Form Tools. If you have used any of these products, I would appreciate your input on your experience and your recommendations. Does anyone know of other products I should be sure to include in my review? We will eventually move hundreds of forms over to the chosen solution, so I want to be as confident as I can that it will still be viable in the coming years. Looking forward to everyone's feedback!

    Read the article

  • Z-index preventing hover effect on another element [migrated]

    - by user18294
    I have two different elements within a larger container; let's call them div class="overlay_container" and div class="title". The "overlay_container" div has a child class, .image, which creates an overlay over the entire larger container on hover. The "title" div has a z-index of 10,000 and sits above .image, and therefore above the overlay. Unfortunately, when you hover over "title", the overlay image underneath disappears. I understand the problem: the "title" div sits directly over the other divs, so because of the z-index the hover on .image is lost. But how do I fix this? How do I make the .image overlay still appear when you hover over "title"? If your answer involves jQuery, could you please tell me where to put the script (before the closing /head tag)? Thanks!

    Read the article

  • How to Keep SEO Score from Dropping with Duplicate Content

    - by joeh0717
    I'm hoping someone has a solution for what I'm trying to accomplish. I'm working on a travel agency website, and there's an "Overview" section for each cruise line, located on each cruise line's index page. Here's my issue: the company is building a search engine that includes details on each cruise line. Their write-ups are great, so I'd like to reuse the overview they created for each cruise line rather than writing new ones. However, I don't want duplicating their content to hurt the SEO of the pages it was originally published on - and it will duplicate, since each page dynamically generated by the search engine will include a section about the cruise line (where I'd place the overview). Question: is there any way to include these overviews (ideally copying the exact HTML they've already implemented) without the search engines indexing those particular sections? I'd want the rest of each search result page to be indexed - just not the section containing the duplicate content. I saw something about a span class named robots-nocontent in Yahoo (not sure whether this also applies to Bing) and googleon / googleoff tags for Google. Is this the best solution? I'm open to any suggestions, thanks!

    Read the article

  • How can I redirect any kind of access to my site to somewhere else, except for me?

    - by Omega
    My site, www.example.com, has a MyBB forum installation, and many users constantly check it. I would like to make a major change to it - basically, I want to delete everything and start clean. This rebuilding process may take hours, since I need to do a lot of things. So whenever a user tries to enter any part of my site (www.example.com, www.example.com/blog, etc.), they should be redirected to www.example.com/underconstruction, where a simple HTML page explains the current situation. But not only that - while I am configuring the site, I need to visit it myself to make sure things actually work. So it has to redirect everyone who isn't me. Is that possible in any way? I am using GoDaddy hosting.
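
    On typical Apache shared hosting (GoDaddy's Linux plans included) this is usually done with mod_rewrite in the site's .htaccess. The sketch below is illustrative only: the IP is a placeholder for your own public address, and it assumes the maintenance page lives at /underconstruction.

        RewriteEngine On
        # Let yourself through (replace with your own public IP address).
        RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.42$
        # Don't redirect the maintenance page itself, or you would loop forever.
        RewriteCond %{REQUEST_URI} !^/underconstruction
        RewriteRule ^ /underconstruction [R=302,L]

    A 302 (temporary) redirect is deliberate here, so search engines don't treat the maintenance page as the permanent home of every URL.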

    Read the article

  • Mobile web app, styling in percentages; I can't get height to work [migrated]

    - by Mick79
    I am building a mobile app for a band and obviously want it to display well on the plethora of handsets out there today. I built it at first for my own device, where it looks and works great, so now I am reworking it in percentages so that it works on all devices. I have a slider (jQuery Tools) going on, and if I set the width to 100% it is perfectly wide on my iPhone and my iPad - success. However, I am having no luck with the height: it seems to only accept a height in px, and if I set a height in percent it just doesn't display. Any ideas?

        #header {
            width: 100%;
            height: 198px;
            position: relative;
            z-index: 20;
            box-shadow: 0 0 10px white;
        }

        .scrollable {
            /* required settings */
            position: relative;
            overflow: hidden;
            width: 100%;
            height: 100%;
            box-shadow: 0 0 20px purple;
            z-index: 20;
        }

        .scrollable .items {
            /* this cannot be too large */
            width: 500%;
            height: 100%;
            position: absolute;
            clear: both;
            box-shadow: 0 0 30px green;
        }

        .items div {
            float: left;
            width: 20%;
            height: 100%;
        }

        /* single scrollable item */
        .scrollable img {
            float: left;
            width: 100%;
            height: 100%;
            /* height: 198px; */
        }

        /* active item */
        .scrollable .active {
            border: 2px solid #000;
            position: relative;
            cursor: default;
        }

    Read the article

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom variable tracking to my company's website. The site uses server-side includes, so any modification to the tracking code happens identically everywhere, which makes maintenance a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, which makes troubleshooting a headache too. So suppose you have custom variables like these:

        // visitor level tracking, id = 12345
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);

        // page level tracking, email = [email protected]
        _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]);

    The marketing people want the following out of this: a user visits the site and we record a unique id for them, and whenever they return this id is used in GA; a user signs up for our newsletter on page X and we record their email address, and whenever they return this email address is used in GA. Now, a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work. Do I want page, session or visitor level tracking? What happens given that the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but they then go somewhere else where the email is nonexistent, will the value get 'overwritten'? Sorry for the long question, but there are a lot of unknowns for a GA noob.

    Read the article

  • Using Minified Page Specific JS [migrated]

    - by Mike C
    I've been working on a rather large-scale project that uses a number of different pages, each with some very specific JavaScript. To lessen load times, I plan to minify it all into one file before deploying. The problem is this: how should I avoid running page-specific JS on pages which don't require it? So far my best solution has been to wrap each page in an additional container:

        <div id='some_page'> ...everything else... </div>

    and I extended jQuery so I can do something like this:

        // If this element exists when the DOM is ready, execute the function
        $('#some_page').ready(function() { ... });

    Which, while kind of cool, just rubs me the wrong way.

    Read the article

  • Spaces in an img's "alt" attribute - good or bad for search engines?

    - by Camran
    I am trying to make it easier for search engines to crawl my website, as it is almost 100% dynamic. I have a couple of transparent images which are actually links to sections of my page. If I add an "alt" attribute containing space characters to explain the target, will this improve SE rankings etc.? For example: <img src="blabla.png" alt="post new classified"> Or will this just result in errors? And what should I put in the alt attribute if I can't use spaces? PS: a different, short question - will JavaScript-rich content make a page less important to crawlers? Thanks

    Read the article

  • IIS and content caching

    - by JayC
    I'm a web developer and the administrator of a Windows 2008 R2 cloud instance running IIS 7. I recently made an update to our website, but when I revisited it, the site was displayed with the old styling. I did a hard refresh (Shift + reload button in Firefox) and of course the website then displayed as it should. I didn't worry about it until my client had the same issue in Safari. So my question, in general, is: how do I prevent this from happening again while still allowing some caching of our site? I noticed we did not have content expiration set up for the sites on our web server, so I've set that up - but did I really need to? I've also looked at ETags and, honestly, it's hard for me to know whether or not I should use them. One comment I read somewhere said there isn't really any issue with ETags in IIS (even in web farms)... but I dunno. Anybody have any suggestions, links, info? Thanks.
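
    Content expiration and ETags solve the problem in different ways: an Expires/Cache-Control header tells the browser not to ask again until a given time, while an ETag lets the browser ask cheaply and receive a 304 if nothing changed. A quick way to watch the ETag round trip (a sketch using the Python requests library; the stylesheet URL is a placeholder):

        import requests  # pip install requests

        url = "https://www.example.com/css/site.css"   # placeholder URL

        first = requests.get(url)
        etag = first.headers.get("ETag")

        # Replay the request with the validator; 304 means the cached copy is still good.
        second = requests.get(url, headers={"If-None-Match": etag} if etag else {})
        print(first.status_code, etag, second.status_code)

    A common pattern that avoids stale-CSS surprises entirely is far-future expiration plus a version token in the filename or query string, so each deploy changes the URL and every browser fetches the new file.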

    Read the article

  • Why is my website not ranking on the first page of Google? [on hold]

    - by India SEO Analyst
    I am handling the website www.usamovingandstorage.com and targeting the keywords "chicago movers", but my website is on the third page. The website has decent backlinks, and recently I removed the irrelevant ones. I compared my competitors' websites, such as www.ampolmoving.com and www.chicagomovers.com; they have no such big backlinks, yet they rank on the first page in Google. I compared the three websites in www.opensiteexplorer.org, and there my site shows good results. So how did this happen? I need a full comparison: why is my site ranking on the third page, and what actions do I need to take to rank on the first page?

    Read the article

  • What happens when you close an Adsense account?

    - by rakibtg
    I need to change my payee name. I asked in the Google AdSense product forum and one of the top contributors replied: "You will have to close the account & apply again with using your real payee name. That's why they specifically state that the payee name needs to match the full name on your bank account." https://support.google.com/adsense/answer/47333?hl=en This makes sense, but I still have a few questions, because the support page doesn't have enough detail to help me. My questions are: What happens when you close your AdSense account? If I apply again, what is the process for regaining my account - do I have to apply for a website again, and will the AdSense team review and approve it? Is there any chance my account will be disapproved? And what about my current checks? I have two checks in hand - will Google send those checks again with my new payee name? Has anyone experienced this problem? I asked on the Google forum but got no answer!

    Read the article

  • How can CloudFlare obfuscate IPs? [on hold]

    - by Sharen Eayrs
    I have several domains, and I do not want those domains to be linked together. For example, say I have domaina.com: I do not want other users to be able to see that domaina.com is hosted together with domainb.com. One way to do this is to use many separate shared hosting services, but sometimes shared hosting is not powerful enough. Another way is to use multiple VPSes, which is acceptable. My VPS provider says that I can accomplish this with CloudFlare. However, I am confused about how CloudFlare can hide my IP.
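
    CloudFlare acts as a reverse proxy: your DNS records point at CloudFlare's edge addresses, CloudFlare fetches pages from your origin, and visitors only ever resolve and connect to the edge. You can see the effect with a plain DNS lookup (a small sketch; the hostname is a placeholder):

        import socket

        # For a proxied ("orange cloud") record this prints a CloudFlare edge IP,
        # not the VPS that actually hosts the site.
        print(socket.gethostbyname("domaina.example.com"))

    The origin IP stays hidden only as long as nothing else leaks it - mail headers sent from the same VPS, historical DNS records, or subdomains that bypass the proxy can all undo the obfuscation, which matters if the goal is to keep domaina.com and domainb.com unlinkable.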

    Read the article

  • Content-light website and Google - telling Google it's a listings site (as opposed to a shop, reviews or restaurants)

    - by Doug Firr
    I have a listings-style website. Due to the nature of listings, the site is light on content: each page is typically less than 50 words, but there are many pages. The site has had a ton of media coverage and so has some great inbound links from places like Wired, Fast Company, the Canadian Broadcasting Corporation and many other bloggers, media websites and recycling-niche authors (it's a recycling site). But Google largely ignores it: traffic from search is very low, less than 5% of all traffic. I know that using markup you can tell Google whether your site is a restaurant, article, review, shop, local business or one of a few other categories (https://www.google.com/webmasters/markup-helper/u/0/). Is there a way to tell Google that my site is a listings site? I suspect, but do not know for sure, that part of the problem is that Google simply does not know what my site is. It's a crowd map where people post curb alerts; the information is useful, but it is presented in a short, concise way - a pin on a map, a picture and a short description - and adding anything further is not necessary for the site's intended purpose. First question: how best to tell the search engines what my site is - a listings site, not some spammy website? And any recommendations for improving our search presence? You can take a look here if interested: http://tinyurl.com/lxg4hn7

    Read the article

  • How can I fix the #c3284d# malvertising hack on my website?

    - by crm
    For the past couple of weeks, at semi-regular intervals, this website has had the #c3284d# malware code inserted into some of its .php files; the .htaccess file has also had its equivalent code inserted. On many occasions I have removed the malicious code, replaced files, changed the FTP password in my FTP client (CoreFTP), and switched the connection method to FTPS so the password is stored more securely (instead of in plain text). I have also scanned my computer several times with AVG and Windows Defender, which found no malware that might have been stealing my FTP passwords. I used Sucuri SiteCheck, which says my website is clean of malware - which is bizarre, because I just clicked one of the links on the site a minute ago and it sent me to another of these random stats.php sites, even though it appears I have removed the #c3284d# code again (it will no doubt be re-inserted somehow in an hour or so). Has anyone found an actual viable solution for this malware hack? I have done just about all of the things suggested here and here, and the problem still persists. Currently, when I click a link in the site's navigation menu in Google Chrome, I get Google's malware warning page: "Warning: Something's Not Right Here! oxsanasiberians.com contains malware. Your computer might catch a virus if you visit this site. Google has found that malicious software may be installed onto your computer if you proceed. If you've visited this site in the past or you trust this site, it's possible that it has just recently been compromised by a hacker. You should not proceed. Why not try again tomorrow or go somewhere else? We have already notified oxsanasiberians.com that we found malware on the site. For more about the problems found on oxsanasiberians.com, visit the Google Safe Browsing diagnostic page." I'm wondering whether the Google Chrome browser I am using has itself been hacked. Does anyone else get redirected when clicking links on the website?
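
    While tracking down the entry point, it helps to enumerate every file that currently carries the injected marker so the cleanup is complete rather than piecemeal. A small sketch to run against a local copy of the site (the web-root path is a placeholder):

        import os

        MARKER = "#c3284d#"          # the string this infection injects
        WEB_ROOT = "site_backup"     # placeholder: a local copy of the document root

        for root, _dirs, files in os.walk(WEB_ROOT):
            for name in files:
                if name.endswith(".php") or name == ".htaccess":
                    path = os.path.join(root, name)
                    with open(path, "r", errors="ignore") as fh:
                        if MARKER in fh.read():
                            print("infected:", path)

    Cleaning the files alone rarely ends it; repeated reinfection usually points to stolen FTP credentials cached somewhere, an outdated CMS or plugin, or a writable upload directory that still needs to be ruled out.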

    Read the article
