Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.


  • Numbered list with subclauses

    - by Barry Clearwater
    I'm trying to create a legal document with decimal-numbered subclauses, then alpha, roman, and sub-sub-sub clauses. (Whew!) The layout I'm after looks like this:
    1. MAIN HEADING
    1.1 This is an example of a sub-clause; even though the words continue to the right, it would be best if they wrapped around and formed a block to the right of the decimal number.
    1.2 The second sub-clause should wrap the same way, hanging in from the left as a block. See below for the remaining clauses:
    (a) this list is purely for demonstration and should not be construed as legal language in any way, nor need it make sense
    (b) should the indentation take more than:
    i) this many lines it would be overly big
    ii) legal numbering continues in the sub-sub clauses with lower roman lettering and should flow below in a block
    iii) and continue the formatting onto the next line, but sit underneath the body of the text rather than begin directly below the number itself. In this example the text carries out to the right, but I need it to wrap around underneath. Sorry it's so wordy; I need this much text to show the example.
    2. Second Clause Heading
    2.1 and so it continues
    I've found examples for decimal numbering, but they do not create a block to the right of the number, and they carry on with multiple decimals rather than alpha and roman subclauses.
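
    The question doesn't say which typesetting system is in use; assuming LaTeX, a minimal sketch of the four nested levels with the enumitem package (untested) might look like this, where leftmargin=* makes wrapped lines form a block to the right of each label:

        \documentclass{article}
        \usepackage{enumitem}% provides the label= and leftmargin= options used below
        \begin{document}
        \begin{enumerate}[label=\arabic*., leftmargin=*]
          \item MAIN HEADING
          \begin{enumerate}[label=\arabic{enumi}.\arabic*, leftmargin=*]
            \item A sub-clause; long text wraps as a block to the right of the number.
            \begin{enumerate}[label=(\alph*), leftmargin=*]
              \item an alpha sub-sub clause
              \begin{enumerate}[label=\roman*), leftmargin=*]
                \item a roman sub-sub-sub clause whose text also wraps beneath itself
              \end{enumerate}
            \end{enumerate}
          \end{enumerate}
        \end{enumerate}
        \end{document}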

    Read the article

  • Google Webmaster Tools, DNS Errors & HostPapa

    - by Gravy
    Received a message from Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 40.0%. You can see more details about these errors in Webmaster Tools. Recommended action: ..." I contacted HostPapa and they deny there is any issue with the site or server! Their support, in terms of what I can actually do to resolve this, is non-existent. The site is currently online, and I don't know much about DNS, so any advice about how to resolve this problem would be much appreciated. Basically, the message from Google says it is my web host's fault, while the message from my web host (HostPapa) is: "Just tell Google to crawl your site, as there are no errors."
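
    One way to check the claim independently, assuming a machine with the dig utility (the HostPapa name-server host below is a placeholder, not a real record):

        # list the authoritative name servers for the domain
        dig yourdomain.com NS +short
        # query one of the listed servers directly; repeated timeouts here
        # would back up Google's 40% error rate
        dig @ns1.example-hostpapa-ns.com yourdomain.com A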

    Read the article

  • URL blocked in robots.txt but still showing up on Google search [closed]

    - by Ahmad Alfy
    Possible Duplicate: Why do Google search results include pages disallowed in robots.txt? In my robots.txt I disallow a lot of URLs; Google Webmaster Tools says there are 750+ URLs blocked. The problem is that the URLs still show up in Google search. For example, I have the following rule: Disallow: /entity/child-health/ But when I search for some-keyword + child health, the following URL shows up: http://www.sitename.com/entity/child-health/ Am I doing anything wrong? Is it possible for a URL to be blocked by robots.txt and still show up in search results?
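
    For what it's worth, robots.txt only stops crawling, not indexing: Google can still list a blocked URL it discovers through links, just without a snippet. The usual way to get a page out of the results is to allow crawling and serve a noindex hint instead, for example in the page head:

        <meta name="robots" content="noindex">

    or the equivalent X-Robots-Tag: noindex HTTP response header for non-HTML resources.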

    Read the article

  • Chmod 777 to a folder and all contents on Apache web server

    - by Ryan Murphy
    I have just got new hosting for my website, with a directory /www in which I put all my website files, and a folder in that directory called 'store'. Within 'store' are several files and folders. I want to give the folder 'store', and all files and folders within it, full permissions. How do I do this? I am guessing via .htaccess. I tried inserting chmod 777 -R /store into the .htaccess file, but that didn't work; it threw a big on-screen error at me. I want to make all the files and folders within /store writable.
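
    A likely cause of that error: .htaccess files hold Apache directives, so a shell command such as chmod is a syntax error there. If the host offers SSH, a sketch of the usual approach (the path is a placeholder; note that 777 grants everyone full access, and 755 or 775 is normally safer):

        cd /path/to/www        # adjust to the real document root
        chmod -R 777 store     # -R recurses into every file and folder under store/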

    Read the article

  • Does Google submit HTML forms?

    - by Saeed Neamati
    I have a web page, say http://domain/purchase, and on this page I have a web form. On submitting this form (which has both client-side and server-side validation, and won't pass until the fields are filled in appropriately), the user is redirected to another page, where (s)he can choose other things, specify other settings, and then purchase our product. Say the second page is http://domain/options. So a user comes to our site, visits http://domain/purchase, fills in the form, submits it, and is redirected to the second page, http://domain/options?parameter1=value1&parameter2=value2, which carries parameters from the first page. This is a very common way of passing parameters between web pages (or technically, between URLs). Now, reviewing my website, I saw that Google had indexed some of these redirect-target URLs, like: http://domain/options?parameter1=value1&parameter2=value2 http://domain/options?parameter1=value3&parameter2=value4 http://domain/options?parameter1=value5&parameter2=value6 http://domain/options?parameter1=value7&parameter2=value8 http://domain/options?parameter1=value9&parameter2=value10 This suggests that Googlebot has visited our http://domain/purchase page, filled in our form, submitted it, and been redirected to the other URL with the corresponding parameters; that is the only explanation that makes sense to me. Does Google really fill in forms? PS: All the parameters are meaningful, i.e. they are not filled in arbitrarily. For example, the phone parameter in the indexed pages contains correct phone numbers. How is that possible?
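
    Worth noting: if the form uses method="get", submitting it produces an ordinary URL with query parameters, so no form-filling is required to reach such a page; any link to that URL (shared by a user, for instance) is crawlable. A sketch of the idea:

        <!-- method="get" turns the fields into query parameters on submit -->
        <form action="/options" method="get">
          <input type="text" name="parameter1">
          <input type="text" name="parameter2">
          <button type="submit">Continue</button>
        </form>
        <!-- submitting yields /options?parameter1=...&parameter2=... -->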

    Read the article

  • DNS and Wildcards

    - by Thomas Chapman
    Whenever I attempt to make a record for *.schneiderdonnelly.com.au and CNAME it, I get two errors: "You can't mix CNAME/MX records together using the same hostname." and "Domain root's cannot be CNAME's, however you can web-forward this record to www.schneiderdonnelly.com.au instead for the same effect." I've read it's possible, so why can't I make it work? I donated $5 to become a premium member, and I've been trying to make it work for yonks. http://i.stack.imgur.com/D9Ui5.jpg This is how I want it to appear (the last record). I am prepared to swap DNS providers as long as they're free.
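
    Background that both error messages hint at: a CNAME cannot coexist with any other record at the same name, and the zone apex always carries SOA/NS (and here MX) records, so it can never be a CNAME. A zone-file sketch (IP and mail host are placeholders) that keeps the wildcard clear of the apex:

        ; apex: A and MX live here, so no CNAME is allowed at this name
        schneiderdonnelly.com.au.    IN A     203.0.113.10
        schneiderdonnelly.com.au.    IN MX 10 mail.example.com.
        ; the wildcard sits on its own name, avoiding the CNAME/MX clash
        *.schneiderdonnelly.com.au.  IN CNAME schneiderdonnelly.com.au.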

    Read the article

  • How to reload Google DFP in AJAX content? [migrated]

    - by cj333
    Google DFP supports ads in AJAX content, but since I put all the code in the main page, it always shows the same ads even when turning the page reloads the AJAX content. I read the thread at http://productforums.google.com/forum/#!msg/dfp/7MxNjJk46DQ/4SAhMkh2RU4J, but my code doesn't work. Any working code suggestions? Thanks. Main page code:
        <script type='text/javascript'>
        $(document).ready(function(){
          $('#next').live('click',function(){
            var num = $(this).html();
            $.ajax({
              url: "album-slider.php",
              dataType: "html",
              type: 'POST',
              data: 'photo=' + num,
              success: function(data){
                $("#slider").center();
                googletag.cmd.push(function() {
                  googletag.defineSlot('/1*******/ads-728-90', [728, 90], 'div-gpt-ad-1**********-'+ num).addService(googletag.pubads());
                  googletag.pubads().enableSingleRequest();
                  googletag.enableServices();
                });
              }
            });
          });
        });
        </script>
    album-slider.php (the original was missing the closing quote in googletag.display(), fixed here):
        <!-- ads-728-90 -->
        <div id='div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>' style='width:728px; height:90px;'>
          <script type='text/javascript'>
            googletag.cmd.push(function() {
              googletag.display('div-gpt-ad-1**********-<?php echo $_GET['photo']; ?>');
            });
          </script>
        </div>
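
    A sketch of the pattern that usually fixes this, keeping the question's placeholder network and div IDs: define each slot only once, display it on first appearance, and call googletag.pubads().refresh() on later AJAX loads instead of redefining the slot. The AJAX success handler would call showAd(num) in place of the defineSlot block above:

        var definedSlots = {};  // slots that have already been defined, by div ID

        function showAd(num) {
          googletag.cmd.push(function () {
            var divId = 'div-gpt-ad-1**********-' + num;
            if (!definedSlots[divId]) {
              // first appearance: define the slot and display it
              definedSlots[divId] = googletag
                .defineSlot('/1*******/ads-728-90', [728, 90], divId)
                .addService(googletag.pubads());
              googletag.display(divId);
            } else {
              // already defined: just request a fresh ad for this slot
              googletag.pubads().refresh([definedSlots[divId]]);
            }
          });
        }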

    Read the article

  • To have multiple sites with similar content on the same server but different IPs?

    - by Eugene
    I have 7 different sites on the same dedicated server. The two major sites have different IPs, and the 5 small ones share one IP. The first two have similar content, but not the same: they have different URLs, titles, and meta data, but they are in the same niche. I was considering two strategies: move one of the two main sites to a different server, or move the 5 other sites to a different server; I'm not sure which way is better. My questions are: Wouldn't it be better, from an SEO standpoint, to host those 2 sites on different servers? Is it worth paying for an additional server? Do you know whether Google penalizes sites for similar content on the same server with different IPs?

    Read the article

  • Replace %26 with %2526 in htaccess

    - by Patrick
    I would like .htaccess to rewrite example.com/something_%26_else to example.com/something_%2526_else. I'm importing a bunch of pages that have ampersands in the title from MediaWiki; these are encoded as %26. Drupal, for various reasons, double-encodes the URL so it becomes %2526. I simply can't create the alias within Drupal, so I have to use .htaccess. This is what I have as my rule so far: RewriteRule ^w/([^%26]+)\%26(.*)$ w/$1\%2526$2 [R=301] I asked this question three months ago on Stack Exchange and was not able to get it working. I tried hiring a contractor for this but was unable to find one, so this is my last-ditch effort before I completely give up. I really appreciate the help. All the best, Patrick
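
    One detail that may explain the failure: Apache has already decoded %26 to a literal & by the time RewriteRule sees the path, so the pattern should match & itself; the class [^%26] actually means "any character except %, 2 and 6". An untested sketch:

        # match the decoded "&"; \% keeps the percent literal (a bare %N is a
        # RewriteCond backreference) and NE stops Apache re-escaping the output
        RewriteRule ^w/(.+)&(.+)$ /w/$1\%2526$2 [R=301,NE,L]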

    Read the article

  • Singular or Plural Nouns as file names for better Search & SEO friendliness? [closed]

    - by Sam
    Possible Duplicate: Should I use singular or plural nouns in a domain name and why? Dear folks, here are two scenarios where the file names should best match what audiences actually search for.
    Scenario 1:
    website.org/en/logo.php
    website.org/en/brochure.php
    website.org/en/poster.php
    website.org/en/design.php
    OR Scenario 2:
    website.org/en/logos.php
    website.org/en/brochures.php
    website.org/en/posters.php
    website.org/en/designs.php
    Q1. What do you intuitively think would be best?
    Q2. What do the facts show in general? Do people search for the singular or the plural?
    Q3. Do search engines have a common rule of thumb for this?
    Q4. Should I pick one scenario and use it consistently, or does it depend on the word?
    Thanks very much for your ideas/suggestions. I really don't know which one to go for.

    Read the article

  • Need assistance for domain forwarding

    - by Yusuf Andre
    Briefly, my question is about combining my registered domain with the websites listening on ports 80 and 8080 on my server. I have an IIS web server on Windows 7 with two websites, listening on ports 80 and 8080, and I have successfully forwarded incoming requests on those ports to the web server. Everything works like a charm when I access the sites from a computer outside the local network via http://myglobalip:80/Index.aspx or http://myglobalip:8080/Index.aspx. Now, I have a domain registered, let's say www.mydomain.com. What steps should I follow, and in which sequence? What should I consider doing? I need a step-by-step guide to follow. I registered the domain on GoDaddy's website and only configured forwarding, so the domain forwards to my web server, but when I attempt to access the web page it just tries and tries until it times out.
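
    For reference, the usual alternative to a registrar's "forwarding" feature is to point the domain straight at the server with A records (placeholder IP below) and add a matching host-header binding for the port-80 site in IIS; the site on 8080 would still need the explicit port in the URL:

        mydomain.com.      IN A  203.0.113.25
        www.mydomain.com.  IN A  203.0.113.25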

    Read the article

  • Forum software alternative to phpBB3

    - by Fernando
    I've been using phpBB3 for quite some time now, and it seems to me this forum software hasn't evolved at all in all these years. Installing mods is a hassle, updating to a newer version is a real pain in the arse, and moderating is not intuitive at all. Besides, I find there's just no way to stop spam on it. Lots of web software has done a great job of controlling spam, but phpBB3 still doesn't, at least not without complex and tedious work. Since my last attempt to update to the latest version broke it, I'm finally fed up, and I've decided I'm not wasting another minute maintaining such a beast. I'm looking for a free software alternative (free as in free beer and free as in free speech), so SMF is not an alternative at the moment. The most important feature I'm looking for is a script to migrate all of the current phpBB users and posts into the new system. Out of all the alternatives out there, do any of them support these features? Which one do you recommend?

    Read the article

  • Apache authentication, security exceptions and safari

    - by Purcell
    I have Apache authentication set up on a site, and it works fine in Firefox and Chrome: you type in the username/password once and can then happily visit any page on the site. Unfortunately this is not the behavior in Safari, where you must re-enter your credentials every time you go to another page. Is there some way I can look at the security exceptions in Safari and set it to always trust the certificate, or find some other setting so it doesn't ask for authentication on each page?
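
    One thing worth checking first: browsers cache Basic auth credentials per realm, so if the AuthName differs between protected directories, Safari may treat each as a separate protection space and prompt again. A typical block, for comparison (paths and realm are placeholders):

        AuthType Basic
        AuthName "Members Area"
        AuthUserFile /path/to/.htpasswd
        Require valid-user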

    Read the article

  • My VPS cannot send email

    - by ifdion
    Webmaster newbie question: I have a low-end VPS (128MB RAM) running Debian. I used a bash script by ilevkov to set up the box, and after some trial and error I managed to get WordPress running on it. Just now I found out that my VPS can't send any email. I tested using the WordPress password-reset email, and it shows "The e-mail could not be sent. Possible reason: your host may have disabled the mail() function..." After a Google session I figured I could test sending email from SSH, so I tried:
    mail [email protected]
    Subject: Halo dion
    some message
    .
    and the result said:
    EOT
    /usr/lib/sendmail: No such file or directory
    "/root/dead.letter" 9/243
    . . . message not sent.
    The question: how can I fix my VPS mail setup?
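
    The "No such file or directory" error suggests no mail transfer agent is installed at all, and PHP's mail() needs the same sendmail binary. On Debian, installing one as root is a plausible first step (exim4, or a lightweight relay such as ssmtp, would also do on 128MB RAM):

        apt-get update
        apt-get install postfix   # provides the sendmail binary used by mail() and the mail command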

    Read the article

  • Facebook Comments and page SEO

    - by Gaurav Gupta
    Facebook's recently launched commenting system for blogs loads comments in an iframe instead of loading them inline. Since blog comments can often contribute significantly to a page's SEO, is it a good idea to use Facebook's system on my blog? Or does Google recognize iframe content as part of the page and treat it as such? (It's noteworthy that Disqus.com does not use iframes and loads all comments inline.)

    Read the article

  • Running Non-profit Web Applications on Cloud/Dedicated Hosting [closed]

    - by cillosis
    Possible Duplicate: How to find web hosting that meets my requirements? I often build web applications purely because I enjoy it. I like building useful tools or open source applications that don't come with a price tag. That said, many of these applications can be quite complex, requiring services beyond shared hosting (e.g. specific PHP extensions). This leaves me with two options: make the web application less complex so it runs on shared hosting, or fork out money for cloud or dedicated/VPS hosting. Since the application is free (I don't intentionally make money off it), the money for hosting comes out of my own pocket, and I know I am not alone in this sticky situation. So the question is: what hosting options provide more advanced features, such as shell access via SSH and the ability to install specific software/extensions (e.g. a NoSQL DB such as Redis, MongoDB, or Cassandra), at a free or low price point? I know free usually equates to bad/unreliable hosting, but that's not always the case. There are a couple of providers with free plans I know of: Amazon EC2 (free micro-instance for 1 year) and AppHarbor (cloud-based .NET web application hosting with a free plan). What else is available for hosting non-profit applications?

    Read the article

  • extra configuration needed after installing SSL certificate?

    - by ptriek
    We recently developed two rather simple PHP applications for AXA (a European bank). The URLs are axa.tfo.be/incentives/cipres and axa.tfo.be/incentives/zrkk (access to both sites is restricted to visitors with cookies containing encrypted passwords). A previous security audit by an external company found several security issues, all of which were fixed by a colleague PHP developer. However, one last requirement has been added: all data should be transferred over https. My PHP colleague is on holiday and unavailable at the moment, so I contacted my host and asked for an SSL certificate. I have no knowledge of or experience with SSL myself, so I'm a bit at a loss with the following problems. A Comodo SSL certificate plus a unique IP address was installed today by my web host (www.combell.be) for the subdomain axa.tfo.be; however, it doesn't seem to be working. I posted a question about this earlier today and was told not to worry; see http://serverfault.com/questions/339320/what-happens-if-you-install-an-ssl-certificate Current problems: the web applications aren't accessible over https, though http works (if a valid cookie is available); even the static page at http://axa.tfo.be/incentives/cipres/static.html is only accessible over http. My web host is telling me that "my application probably doesn't support SSL" and has asked me to set an SSL variable to true in my PHP code. So my questions: I have basic knowledge of PHP but don't know where to start regarding this "PHP SSL variable". The sites have been online for some time and were developed for regular http access, and Google didn't bring me any help either. Can anyone point me in the right direction, or give me some clues about what I should ask my web host for further assistance? (I'm on a tight schedule: the sites will be audited again on Monday, and it's a customer I wouldn't want to lose...) Thanks for looking into this, and sorry if my questions sound a bit nooby; I'm a web designer, not a server specialist...
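
    Two notes that may help in the meantime. The "SSL variable" the host mentions is most likely $_SERVER['HTTPS'], which PHP populates on secure requests, so the application can detect (or require) https itself; and once the certificate actually answers, a standard .htaccess redirect pushes all traffic to https, as in this sketch:

        RewriteEngine On
        RewriteCond %{HTTPS} !=on
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    If https://axa.tfo.be/ itself times out, though, the certificate or dedicated IP isn't live yet, and no PHP change will fix that; that part is the host's to resolve.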

    Read the article

  • Canonicalization of single, small pages like reviews or product categories [SEO]

    - by Valorized
    In general I rather like the idea of canonicalization, and in most cases Google explains the possible procedures clearly. For example: if I have duplicates because of parameters (e.g. &sort=desc), it's clear I should use a canonical link, provided within the head tag, pointing at the preferred URL. However, I'm wondering how to handle small (not to say thin-content) pages. What's my definition of a small page? An example: on one of my main sites we use a directory-based URL structure:
    example.com/ (root)
    example.com/category-abc/
    example.com/category-abc/produkt-xy/
    Moreover, we provide one page that includes all products: example.com/all-categories/ (it lists all products the same way as the category pages do). For reviews, we use a similar structure:
    example.com/reviews/product-xy/ shows all reviews for one particular product
    example.com/reviews/product-xy/abc-your-product-is-great/ shows one particular review
    example.com/reviews/ shows all reviews for all products (latest first)
    To make it even more complicated: every product page shows its latest 2 reviews at the end. So you see, a lot of potential duplicates.
    Q1. Should I create canonicals for (a) example.com/category-abc/ pointing to example.com/all-categories/, (b) example.com/reviews/product-xy/abc-your-product-is-great/ pointing to example.com/reviews/product-xy/ or to example.com/reviews/, or neither of them?
    Q2. Can I link the collection of categories (all-categories/) and the collections of reviews (reviews/ and reviews/product-xy/) to the single category and the single review respectively? Example: example.com/reviews/ includes, let's say, 100 reviews. Can I somehow use markup that tells search engines: "Hey, wait, you are now looking at a collection of 100 reviews. Do not index this collection; you should rather index every single review as a single page!"? In HTML it might be something like this (which, of course, does not work; it's only to show what I mean): <div class="review" rel="canonical" href="http://example.com/reviews/product-xy/abc-your-product-is-great/">HERE GOES THE REVIEW</div>
    Reason: I don't think it's a great user experience if the user searches for "your product is great" and lands on example.com/reviews/ instead of example.com/reviews/product-xy/abc-your-product-is-great/. On the first page he will have to search and might give up out of frustration; the second result, however, might lead to a conversion. The same applies to categories: if the user is searching for category Z, he might land on the all-categories page and have to scroll down to the (last) category to find what he searched for (Z). So what's best practice? What should I do? Thank you for your help!
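
    For reference, rel="canonical" is a page-level hint that belongs in the head of the duplicate page and points at the preferred URL; there is no per-element form like the div sketch above, which is why that markup cannot work:

        <link rel="canonical" href="http://example.com/reviews/product-xy/abc-your-product-is-great/" />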

    Read the article

  • "Progressive" JPEG: Why do many web sites avoid rendering JPEGs that way? Pros, cons?

    - by Chris W. Rea
    When JPEG images are used by a web page, they are typically rendered top-down, but they can also be rendered using a mode called progressive JPEG, where the image starts out full-size but blurry and then gets sharper with successive passes until it's fully loaded. Progressive loading requires the image to have been saved that way. Why don't more web sites use progressive JPEG? What are the drawbacks? Is it simply a lack of tool support, or are these files somehow inferior to traditional top-down rendered JPEG images?
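
    Tool support is actually widespread; for instance, with PHP's GD extension a baseline JPEG can be re-saved as progressive in a few lines (file names are placeholders):

        <?php
        // load a baseline JPEG and re-save it with progressive encoding
        $img = imagecreatefromjpeg('input.jpg');
        imageinterlace($img, true);             // for JPEG output this enables progressive mode
        imagejpeg($img, 'progressive.jpg', 85); // 85 = quality
        imagedestroy($img);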

    Read the article

  • SEO - why did my google search rank drop?

    - by Brian McCarthy
    My nutrition shop websites for Tampa and Brandon were coming up on page one of Google search results, and now they have disappeared. The two websites serve different markets, although they are close geographically and have the same products, keywords, and layouts; Brandon, FL is considered a suburb of Tampa, FL and could be grouped into the Tampa Bay area. There's also a mirror nutrition-shop site set up for Orlando. Is the drop in Google ranking because: 1) of a Flash banner recently added to the front page; 2) the site is just being re-indexed on all search engines because of new content; or 3) it is seen as duplicate content, since there are separate websites for the cities of Tampa and Brandon but with the same content? What can be done to fix this? Thanks!

    Read the article

  • Curious about a cached old domain

    - by jogesh_p
    I am a bit curious about my new domain. I had a domain before, let's say http://example.com/. Before that domain expired, I bought a new one, http://another-domain.com/, and uploaded all of my content to it. But now, when I search Google for a query related to my http://another-domain.com/ site, I also find my old domain in the results. Will this give a duplicate-content penalty to my new domain, or any other kind of penalty from Google?
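
    If the old domain still resolves to a server under your control, the standard remedy is a site-wide 301 redirect, which tells Google the move is permanent and consolidates the two listings. An untested sketch using the placeholder domains above:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        RewriteRule ^(.*)$ http://another-domain.com/$1 [R=301,L]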

    Read the article

  • WordPress page title repeated in SOME pages

    - by cmykrgbb
    I have created a WordPress site, and titles were working just fine. Then, some time and a few plugins later, I noticed that on SOME pages the title is repeated twice. Example of a wrong page title: Contact - NAME | NAME. Example of a normal title: Our Services | NAME. Now, if I go to General Settings and change the title, it changes both, so no improvement. SEO by Yoast has an option to reset page titles, but that just removes all titles, leaving the current URL as the page title, so that's no good either. Here is the code I originally had: <title><?php wp_title(''); ?><?php if(wp_title('', false)) { echo ' | '; } ?><?php bloginfo('name'); ?></title> Here is the code I am using now: <title><?php wp_title('|'); ?></title> To sum up, I think somewhere in the database there's a wp_title repeated: once using '-' as separator, and another (the current one) using '|'. Any help will be most appreciated, thanks!
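
    One plausible cause of exactly this symptom: SEO by Yoast filters wp_title() when its "Force rewrite titles" option is on, so a theme that also appends bloginfo('name') by hand prints the site name a second time. With the plugin managing titles, the theme line can stay as bare as this sketch:

        <title><?php wp_title('|', true, 'right'); ?></title>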

    Read the article

  • Is there a Content Management System that allows multiple & independent blogs to be running on one domain?

    - by Ron
    Hello webmasters, I am a WordPress fan, and I'm now building a new site but am not sure which CMS can achieve what I'm trying to do. I am building a food blog network for a bunch of cities in the US, and I want my city pages to be independently running blogs themselves. So basically: the home page would be its own blog with its own users, talking about food in general; the Dallas page (a child of the home page) would be its own blog with its own users; the Chicago page likewise; and so on and so forth. The layout and design will be the same throughout; I'm just trying to achieve 25-50 independent blogs on one domain. How can I achieve this? I'm hoping that I don't have to install WordPress into as many subdomains as I create... Thank you for your help in advance. -RP

    Read the article

  • DNS MX and NS entries

    - by unkown
    I was wondering about my domain and whether the following is doable. First of all, this is my "architecture": domain registration at GoDaddy.com, hosting at Dreamhost, mail at Google Apps. Until now I had set up the Google Apps MX entries for my domain through the GoDaddy manager, but now I want to set up the hosting I have hired from Dreamhost. I understand that all I have to do is set up the following Dreamhost NS entries in the GoDaddy domain manager:
    NS1.DREAMHOST.COM. 66.33.206.206
    NS2.DREAMHOST.COM. 208.96.10.221
    NS3.DREAMHOST.COM. 66.33.216.216
    My question is: will my mail keep working correctly, given that the MX entries I set up in GoDaddy are the Google Apps ones?
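
    One thing to keep in mind: once the NS entries point at Dreamhost, GoDaddy's zone (including any MX records set there) is no longer consulted, so the Google Apps MX set would have to be re-created in Dreamhost's DNS panel. For reference, the standard Google Apps MX records:

        @  IN MX 1   ASPMX.L.GOOGLE.COM.
        @  IN MX 5   ALT1.ASPMX.L.GOOGLE.COM.
        @  IN MX 5   ALT2.ASPMX.L.GOOGLE.COM.
        @  IN MX 10  ASPMX2.GOOGLEMAIL.COM.
        @  IN MX 10  ASPMX3.GOOGLEMAIL.COM.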

    Read the article

  • Content farms, scraper sites, aggregators: real-world examples? [closed]

    - by Marco Demaio
    Content farms, scrapers, aggregators: can anyone give real-world examples? Could you please clarify these for me:
    efreedom.com is a scraper site, not a content farm, because it simply copies and pastes content from Stack Overflow?
    ehow.com and squidoo.com are content farms? They don't copy and paste content; they just generate fresh new user-generated content, but too much of it, too quickly.
    experts-exchange.com is NOT a content farm or a scraper site, right? It's simply that many people (me too) hate it (they also wrote to Matt Cutts) because it shows up high in Google while providing a useless question with no answer.
    There are also many sites that act as content aggregators in the form of specialized directories (let's call them CASDs); I don't know how else to define them. Do they have a specific definition? In any case, are these CASDs content farms, scraper sites, or something else? Basically these CASDs search for all sites of the same type, e.g. "restaurant websites"; they copy and paste the content found on "Restaurant A" and create a new page called "Restaurant A" on their aggregator site, then do the same for all websites of the same type, creating a sort of directory of restaurants. Later these CASDs also send an email to the owner of "Restaurant A" (usually the address is on the website) with a username and password so he can modify/update his own page on the CASD site. Later still, these CASDs might ask the owner of "Restaurant A" for money because they bring him traffic; otherwise they remove his page from the aggregator. Someone might call these simply directories, but I think a directory is different, because it is something you add your site to by filling in a form, not something that takes content from your existing site without specific acceptance from the site's owner. I also really wonder how Google will sort out all these messy sites packed with content that show up more and more, everywhere, in search results.

    Read the article
