Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.


  • What's the difference between my nameserver and CNAME settings?

    - by Josh Mcquiston
    I have purchased a domain name (mxsoup.net) through GoDaddy, and it is just parked. In order to set up my custom URL for my SourceAudio site, they give me the following instructions: "In order to host your site at your own URL, we need you to set up some DNS records to point your URL to us. Specifically, we need two CNAME references, one for 'www.mxsoup.net' and one for 'secure.mxsoup.net', both of which should point to 'web2.sourceaudio.com'." But the rep on the phone at GoDaddy said that my site is hosted at HostMonster.com, and therefore I need to talk to them to accomplish this (which is possibly true, but my business owner says he hasn't purchased hosting for this particular domain, although he does have some other sites in his HostMonster hosting account). My GoDaddy account shows that my nameservers are pointing at NS1.HOSTMONSTER.COM and NS2.HOSTMONSTER.COM, and I can edit those. But is this the same as setting up the CNAME records as described above? Any help would be greatly appreciated!
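
    For reference, the two settings live at different levels. A rough sketch, with record values taken from the question (zone-file syntax is illustrative): the NS entries delegate the whole domain to HostMonster's nameservers, while the CNAME entries SourceAudio asks for are individual records inside whichever zone those nameservers serve, so editing the nameservers alone does not create them.

        ; hypothetical zone entries for mxsoup.net (illustrative only)
        mxsoup.net.          IN  NS     ns1.hostmonster.com.
        mxsoup.net.          IN  NS     ns2.hostmonster.com.
        www.mxsoup.net.      IN  CNAME  web2.sourceaudio.com.
        secure.mxsoup.net.   IN  CNAME  web2.sourceaudio.com.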

    Read the article

  • How to import mass accounts into iKode Newsletter Server?

    - by Brownsithily Smith
    I have sent out emails to my 6.2k subscribers through iKode Newsletter Server, and only about 50 were flagged as spam. That is less than 1%, which is amazing! The web-based email marketing software from iKode also provides a double opt-in subscription form, which is effective for targeting a specific audience. However, if I want to import a mailing list into this software, I need to add the addresses one by one, which is a waste of time. Does iKode provide a mass account import ability, or a way to simply upload a mailing list file?

    Read the article

  • Is it bad practice to create a print-friendly page to remove the use of PDFs?

    - by Phil
    The company I work for has a one-page invoice that uses the TCPDF library. They wanted to make some design changes that I found incredibly difficult to set up in PDF format. Using HTML/CSS I could easily create the page and have it print very nicely, but I have a feeling that I am overlooking something. Is it good practice to set up a page just for printing? And if not, is it at least better than putting out an ugly PDF? I could also put the CSS inline so that if they wanted to download the page and open it later, they could.
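
    A minimal sketch of the print-stylesheet approach described above, with hypothetical class names: the screen layout stays as designed, while printing hides the site chrome and keeps only the invoice.

        /* hypothetical print stylesheet for the invoice page */
        @media print {
          nav, header, footer, .no-print { display: none; }  /* hide site chrome */
          .invoice { width: 100%; font-size: 12pt; }
          .invoice td, .invoice th { border: 1px solid #000; }
        }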

    Read the article

  • Letting search engines know that different links to identical pages stress different parts of the page

    - by balpha
    When you follow a permalink to a chat message in the Stack Exchange chat, you get a view of the transcript page for the day that contains the particular message. This message is highlighted in yellow, and the page is scrolled to its position. Sometimes – admittedly rarely, but it happens – a web search will result in such a transcript link. Here's a (constructed, obviously) example: a Google search for "strange behavior of the \bibliography command site:chat.stackexchange.com" gives me a link to this chat message. This message is obviously unrelated to my query, but the transcript page does indeed contain my search terms – just in a totally different spot. Both the above links lead to the same content, and Google knows this, since both pages have <link rel="canonical" href="/transcript/41/2012/4/9/0-24" /> in their <head>. The only difference between the two links is which message has the "highlight" CSS class. Is there a way to let Google know that while all three links have the same content, they put an emphasis on a different part of the content? Note that the permalinks on the transcript page already have a #12345 hash to "point" to the relevant chat message, but Google appears to drop it.

    Read the article

  • Weird results from an A/B test in Google Website Optimizer

    - by Yisroel
    I set up a test in Google Website Optimizer that has 3 variations: the original (A), B, and C. In order to further validate the results of the test, I made variation C exactly the same as the original. And that's where the results get weird. Six days into the test, the best performing variation is C. It outperforms the original by 18.4%! How is that possible? Should I now discount the results of this test entirely?

    Read the article

  • Is there any way to discover the traffic of a site I don't control?

    - by George Bailey
    Given the following: the website does not call any external images or scripts; all the content is hosted on a server that is in our control. The website does not contain the verification meta tag, nor does it contain the HTML file that would authorize a Google account to access Webmaster Tools. The access logs have not been provided to any second or third party. Is it possible for a third party to get an idea of how many hits the site is getting, or are they limited to just seeing how high the site ranks? How could a third party determine how well the site is doing under these restrictions? Is there a website you know of for that?

    Read the article

  • What to do when product range evolves and site name does not reflect this?

    - by nitbuntu
    Suppose, just as an example, I have a website with the domain www.gifts-for-dogs.com, but after a few years I start selling stuff for cats and fish. I may not keep enough of a range of products for these other types of pets yet, so I can't justify changing the domain name and logo (to something like gifts-for-pets.com) just yet, but I envisage that I may eventually have to in the not too distant future. What would be a good strategy here, and what are the steps I would have to consider before making these changes?

    Read the article

  • How do I back up my customers' data?

    - by marcamillion
    If you run a SaaS app, or work on one, I would love to hear from you. Where the safety and security of your customers' data is paramount, how do you secure it and back it up? I would love to know your main host (e.g. Heroku, Engine Yard, Rackspace, MediaTemple, etc.) and who you use for your backups. Be as detailed as possible, e.g. a quick overview of your service and the data you store (images, for instance), and what happens with the images when the user uploads them (say they go to your Linode VPS and are posted to the site for them to see, then they are automatically sent to AWS or wherever, then once a week they are backed up to tape by the managed hosting provider, and you also back them up to your house/office). If you could also give some idea as to what the unit cost (per GB/per user/per month) of storage is, on average, I would really appreciate that. I'm getting ready to launch my app, and I would love to get some more perspective on the nitty-gritty details involved. Thanks!

    Read the article

  • What are the requirements to test a website using jquery.get()? [migrated]

    - by Frankie
    I am working on a simple website. It has to search quite a few text files in different sub-folders. The rest of the page uses jQuery, so I would like to use it for this also. The function I am looking at is .get() for downloading the files. So my main question is: can I test this on my local computer (Ubuntu Linux), or do I have to have it uploaded to a server? Also, if there's a better way to go about this, that would be nice to know. However, I'm more worried about getting it working. Thanks, Frankie

    PS: Here's the JS/jQuery code for downloading the files into an array:

        var g_lists = new Array();
        $(":checkbox").each(function(i) {
            if ($(this).attr("name") != "0") {
                // build the path from the checkbox's name attribute
                var path = "../" + $(this).attr("name") + ".txt";
                $("#bot").append("<br />" + path); // debug
                $.get(path, function(data) {
                    g_lists[i] = data;      // each callback keeps its own i
                    $("#bot").html(data);
                });
            } else {
                g_lists[i] = "";
            }
        });

    Edit: Just a note about the path variable. I think it's correct, but I'm not 100% sure; I'm new to web development. Here's the directory tree of the site and some example values path produces, in case it helps:

        .
        +-- include
        |   +-- jquery.js
        |   +-- load.js
        +-- index.xhtml
        +-- style.css
        +-- txt
            +-- Scripting_Tools
                +-- Editors.txt
                +-- Other.txt

    Examples of path: ../txt/Scripting_Tools/Editors.txt and ../txt/Scripting_Tools/Other.txt

    Well, I'm a new user, so I can't "answer" my own question, so I'll just post it here: after asking for help on an IRC channel specific to jQuery, I was told I could use this on a local host. To do this I installed the Apache web server and copied my site into its directory. More information on setting it up can be found here: http://www.howtoforge.com/ubuntu_debian_lamp_server Then to run the site I navigated my browser to "localhost" and everything works.

    Read the article

  • First project is a big one. How much should we charge?

    - by confuzzled
    Two of my cousins and I started a freelance computer repair / web design business just to make some money on the side during college, and we received our first major web design project about three weeks ago. Now, we've created websites before, but mostly for family businesses, and we have never really charged money; most of those websites have been static and don't really require a CMS. This project, however, was a big one (for us anyway). We created a news site that has several categories, we created the banners, and we created a classifieds page (not a web app, just something static that they control). Several links, a few graphical assets, a CSS drop-down menu, an RSS feed from a different news site, weather, all the normal stuff you would find on a regular news site. On top of that we put in all the usual Joomla stuff (search, Jcomments, Jslide pictures, JCE, etc.). Then we uploaded the first 10 articles they gave us, and we are going to train them how to use Joomla. Now, at first we decided on 700 dollars. I assumed they just wanted a simple blog-like website where they could upload articles, but then we had a meeting and they asked for a lot more. Note: we did not hard-code the template from scratch, but customized the Gantry framework to fit their needs. We did code quite a bit, however. I estimate that we put in about 50-60 hours in total. I'm wondering if 700 dollars is a bit low; this price is definitely not set in stone. Please keep in mind that this is our first project and we are newbies, so please be kind. Thank you!

    Read the article

  • Will many links to the same page without nofollow penalize the host site in the search engine rankings?

    - by Evgeny
    This may be a silly question, but I'll give it a shot :). On my forum app I would like to allow users with sufficiently high reputation to display links to their home pages under every post, without the nofollow attribute (while lower-rep users will have the nofollow). I am happy to help the site's contributors improve their own rankings, but I'm not sure if this can actually deteriorate the rank of the host (the site that hosts those links), as potentially the same link to a user's home page may be peppered across the pages of the host. What do you think? Thanks.
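
    For illustration, a hypothetical sketch of the two kinds of signature link being described (the URL is invented):

        <!-- high-reputation user: plain link, passes link equity -->
        <a href="http://user-homepage.example.com/">My homepage</a>

        <!-- lower-reputation user: same link, but marked nofollow -->
        <a href="http://user-homepage.example.com/" rel="nofollow">My homepage</a>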

    Read the article

  • Mitigating lost emails when switching providers

    - by sam
    We're about to change to Gmail from the webmail provided by our hosting provider. I understand changing the MX records and all, but my main worry is whether any emails could fall through the gaps between the two systems during the changeover. I'm not familiar with the ins and outs of how MX records work; is a change like a DNS record change, i.e. does it need to propagate? If that's the case, would there be a period where mail has left my current email provider but hasn't yet switched over to the new Gmail account, allowing emails to not be delivered or, worse, to be lost?
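
    For context, a rough sketch of what changes in the zone (the Google hostnames shown are the commonly published ones, but treat them as illustrative and check the provider's current documentation): MX entries are ordinary DNS records, so a change is subject to the same TTL-based propagation as any other record, and during that window some senders may still see the old values.

        ; before: mail handled by the hosting provider (hypothetical hostname)
        example.com.   3600  IN  MX  10  mail.hostingprovider.example.

        ; after: mail handled by Google
        example.com.   3600  IN  MX  1   aspmx.l.google.com.
        example.com.   3600  IN  MX  5   alt1.aspmx.l.google.com.
        example.com.   3600  IN  MX  5   alt2.aspmx.l.google.com.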

    Read the article

  • What are the consequences of using relative Location headers?

    - by Alan Storm
    According to the spec, Location headers used in a redirect require a server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: http://example.com/foo/baz/bar

    However, in 2012, most web browsers will recognize a relative path and redirect you to the new location using the original server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: /foo/baz/bar

    Are there any negative/surprising consequences to using relative URLs in Location headers? My particular concern is how Google/search-engines will interpret this, but if there's anything else I'm not thinking about, I'd love to hear it.

    Read the article

  • Experienced programmer, beginner at web design, tools for effective maintainable web design? [closed]

    - by Clinton
    I do quite a bit of programming in my work, which I'm comfortable with, but recently I've been trying to do some web design for non-work-related reasons. I've got a Drupal site up and running and have added some content, but it all looks fairly basic: a header with some content, nothing particularly polished. Anyway, as an example, what I wanted to do was make some "bubbles", each with some text in them. From a programmer's point of view, say bubble(question_text, answer_text) might expand to a box with some border, with "Question: " + question_text then "Answer: " + answer_text. Of course I'd have lots of these bubbles, but I'd like to be able to change their look and feel in one place, so plain hand-written HTML would be a maintenance nightmare. I also want to lay them out on the screen in some fashion. I was thinking of a mixture of JavaScript and CSS, or possibly PHP, which Drupal uses. On the other hand, I fear I might be taking a 1990s approach to this, and that there are actually tools available now that make this process a lot easier. I'm just wondering what the best approach to this sort of task is. Should I be using offline web design software and copying the code to Drupal, and if so, any recommendations? I'm sorry if my question is a bit vague, because I'm not really sure what question I should be asking. I'd appreciate it if you answer and comment, and I'll try my best to be more specific as I understand more.
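
    As one hypothetical sketch of the "change it in one place" idea, in the jQuery/CSS direction mentioned above (all names are invented for illustration): a single helper generates each bubble's markup and every bubble shares one CSS class, so the look and feel lives in one stylesheet rule.

        // assumes jQuery is loaded, the page has a <div id="bubbles"> container,
        // and the stylesheet defines a ".bubble" rule (border, padding, etc.)
        function bubble(questionText, answerText) {
            $("<div>").addClass("bubble")
                .append($("<p>").text("Question: " + questionText))
                .append($("<p>").text("Answer: " + answerText))
                .appendTo("#bubbles");
        }

        // usage
        bubble("What is Drupal?", "A PHP-based content management system.");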

    Read the article

  • Managing products on an ecommerce site [closed]

    - by John
    I've had a site that sells widgets for many years. I do not inventory my widgets, but the cost of adding them to the site and making sure the site is current is becoming cost-prohibitive. Here are the facts: I sell a single class of widget. I have about 50,000 widgets on my site. I have about 100 vendors that create and drop-ship the products when they get an order from me via email. Each vendor carries from 50 to 5,000 types of widgets. Vendors all have websites with images and descriptions of their products. Each widget is produced in limited supply and usually sells out in 1-5 years. Prices of a widget often go up, sometimes by more than 50%, before it sells out. My vendors aren't very tech-sophisticated: they have websites with their products, but most can't supply an API or database dump. Their websites usually display retail prices to the public, but I log in or refer to a price list (usually Excel) for wholesale prices. As it stands now, I hire local people to add and describe each widget on our website. It usually takes a person 4 minutes to add a widget to the site, not including moving to a new vendor. I feel like the upload/edit process is as good as it can get via a form/website. The problem is that it is getting very expensive to upload and keep the widget inventory current, and I often get orders for something after it's sold out at the vendor or when the price is wrong. This seems like it would be a problem in many industries. Can anyone suggest the cheapest way to upload inventory and ensure prices are current from my vendors? I'm assuming it will involve outsourcing, but I would like ideas on how to set up the compensation model.

    Read the article

  • How to spot a good social media marketer

    - by fiftyeight
    It's a bit subjective, but helpful and on-topic IMO; maybe it should be made community wiki. I've been in contact with some guest bloggers lately who are interested in publishing posts on my website's blog. So far I've run a regular weekly blog and had a guy marketing it, but he's too busy to do more work right now. Now I need someone to market these blog posts on social media. The last guy I got by recommendation from a friend, but now I need to find one myself. What criteria should I check when it comes to social media marketers? Do the number of followers and fans on their accounts and/or the number of votes they get for articles they market mean anything, or is it impossible to know if it's spam? That's about the only criterion I could think of so far... Thanks to anyone who helps.

    Read the article

  • SEO: Is there a limit to how long titles/descriptions should be?

    - by jiewmeng
    I am trying to fill in the title and meta description for my Tumblr blog. The way I will do that is via the themes. The title isn't too much of an issue; I can just use the title of the post, although some post types do not have titles. The main question is about the description. I am thinking of using the start of the body content, but for now there is no way to limit the length of the body content. I wonder if it's OK to have the whole body in the description? I have contacted Tumblr support to suggest that they add a way of limiting text length in their template tags.

    Read the article

  • can canonical links be used to make 'duplicate' pages unique?

    - by merk
    We have a website that allows users to list items for sale. Think eBay, except we don't actually handle selling the item; we just list it for sale and provide a way to contact the seller. Anyhow, in several cases sellers may have multiple units of an item for sale. We don't have a quantity field, so they upload each item as a separate listing (and using a quantity field is not an option). So we have a lot of pages which basically have the exact same info, and only the item number might be different. The SEO guy we've started using has said we should put a canonical link on each page, and have the canonical link point to itself. So for example, www.mysite.com/something/ would have a canonical link of href="www.mysite.com/something/". This doesn't really seem kosher to me; I thought canonical links were supposed to point to other pages. The SEO guy claims doing it this way will tell Google all these pages are indeed unique, even if they do basically have the same content. This seems a little off to me, since what's to stop a spammer from putting up a million pages and doing this as well? Can anyone tell me if the SEO guy's suggestion is valid or not? If it's not valid, do I need to figure out some way to check for duplicated items and automatically pick one of the duplicates to serve as the original, and generate canonical links based off that? Thanks in advance for any help.
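
    For reference, the self-referencing canonical being described would look something like this in the listing page's <head> (using the example URL above; note that canonical URLs are normally given as absolute URLs):

        <!-- on http://www.mysite.com/something/ -->
        <link rel="canonical" href="http://www.mysite.com/something/" />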

    Read the article

  • Category-to-page setup and blocking category URLs via robots.txt: good for SEO?

    - by user2952353
    I am using a template which, on pages, allows me to add sidebars and more content below and above the content I want to pull from a category, which is very helpful. If I create pages to display my categories' content, won't the page URLs conflict with the category URLs? By conflict I mean causing a duplicate content issue. What I thought might help was to block the blog's category URLs in robots.txt, e.g. /category/books and /category/music. Would that be a good practice in order to avoid a duplicate content penalty? Any tips appreciated.
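
    As a minimal sketch, the robots.txt rules being considered (paths taken from the examples above) would look like this:

        User-agent: *
        Disallow: /category/books
        Disallow: /category/music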

    Read the article

  • Repeating keywords in inbound links

    - by JJ_Jason
    Hi. I have a service similar to bit.ly. The link generation method is similar, but the site is not. A user uses my site just like the mentioned bit.ly, but I offer a different kind of service, one which I would want to rank for (on Google). If I were to generate links such as mysite.com/my-keywords/1Asdf34, would it be considered spammy or black hat? The equivalent for bit.ly would be bit.ly/url-shortening-services/3k1dS4sd. For bit.ly it would defeat the purpose, but URL length in my case does not have to be short.

    Read the article

  • How can I make sure my binary file is served as binary, not text, when the user chooses "Save Linked File As..." in Safari?

    - by Eonil
    I'm serving a binary file (.ipa) from Ubuntu/Apache 2.2. When I choose Save Linked File As... in Safari, it says it's a text file and guides me to add a .txt extension. However, it does not add any extra extension when I download it by just clicking the link. I added the line AddType application/octet-stream .ipa to the Apache configuration file. I don't know what's wrong with this. Is this a bug in Safari or my misconfiguration? (1) If it is caused by a bug, how can I avoid it? (2) Or, if it is caused by misconfiguration, what should I do?
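
    For comparison, a hedged sketch of an Apache 2.2 configuration that both maps the extension to a binary MIME type and explicitly marks the response as a download (the Content-Disposition part assumes mod_headers is enabled; placement in the vhost or .htaccess is illustrative):

        # serve .ipa files as generic binary data
        AddType application/octet-stream .ipa

        # additionally hint to the browser that this is a download
        <FilesMatch "\.ipa$">
            Header set Content-Disposition "attachment"
        </FilesMatch>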

    Read the article

  • Does your company name in an article title damage search engine relevance?

    - by user492681
    I've been wondering about this for a while but have never come across a solid answer. Many websites include their name in the title tags of all their articles; this is often apparent in WordPress blogs, e.g.: "Tsunami hits Japan and leaves thousands homeless | My Website Name". The issue I have is that search engines strip the stop words out of this sentence to leave the words that they compare to the body text. So if I want my article to rank well and be relevant, in this case about the terrible tsunami that has recently struck Japan, what is to STOP the MY WEBSITE NAME section of the title de-valuing the relevance of the article? Am I over-worrying, or should I take this into consideration? Thanks for the advice in advance.

    Read the article

  • What should a page's minimum word count be in order to be effectively indexed?

    - by ZakGottlieb
    I'm seeding a new site with hundreds of (high-quality) posts, but since I am paying per word written, I'm wondering if anybody in the community has any anecdotal evidence as to how many words of content a page needs in order to be counted just the same as a 700+ word post, for example? I know there are always examples of pages ranking well with, for instance, 50 words or less of content, but does anyone have any strong evidence on what the minimum count should be, or has anyone read anything very informative regarding this issue? Thanks a lot in advance!

    Read the article

  • How to show the right country domain in Google Places?

    - by Baumr
    Background: A site has multiple ccTLDs: example.com for the US, example.co.uk for UK users, example.de for Germans, etc. Googling for certain city keywords will return rich snippets with a list of Google Places.

    Problem: When searching on Google Germany, the domain for US users (example.com) appears instead of the corresponding ccTLD (example.de). This is not a good user experience, as users would most likely prefer to book on a site localized for them (e.g. in their language and currency).

    Question: What solutions are there? Is it possible to return different ccTLDs in rich snippets for Google searches in Germany/UK?

    Ideas: Would implementing the hreflang annotation resolve this? What about entering multiple corresponding URLs in the structured data markup?
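
    Regarding the hreflang idea, a sketch of the annotations using the example domains above (each version of the page would list the full set of alternates, including itself):

        <link rel="alternate" hreflang="en-us" href="http://example.com/" />
        <link rel="alternate" hreflang="en-gb" href="http://example.co.uk/" />
        <link rel="alternate" hreflang="de" href="http://example.de/" />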

    Read the article

  • Buying a custom domain for blogger

    - by John Demetriou
    I am about to move my Blogger site to a custom domain. I follow all the steps as instructed, but whenever I find the perfect custom domain (one that is free) I get redirected to Google Apps for Business... Is it a necessity to get Google Apps for Business before buying a custom domain? And if I only start a free trial of Google Apps for Business, will my custom domain still be valid when the trial period expires?

    Read the article
