Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 88/216 | < Previous Page | 84 85 86 87 88 89 90 91 92 93 94 95  | Next Page >

  • I'm on Charter's email blacklist, how do I get removed?

    - by Mike Wills
    I am a programmer at a local government agency. I found out today when I got to work that Charter Internet has blacklisted us over a mailing list that we run. This list is used for communicating news and alerts to our residents. Does anyone have a phone number I can use to contact Charter about this? Normally the email approach is fine, but we have yet another snowstorm coming, and 75 of our customers won't be alerted to a snow emergency if we have one, and may be towed as a result.

    Read the article

  • How to localize new posts in ASP.NET? [closed]

    - by ntechi
    I am doing my final-year project and have decided to make a website in ASP.NET, using Microsoft Visual Studio 2008. I'm making a real estate properties website. I want to know how to localize or create new posts in ASP.NET (like in WordPress), and also, when I hit search, it should search for the desired keyword or the searched post. If posts are not possible then it should display pages...

    Read the article

  • How can I decrease relevancy of Creative Commons footer text? (In Google Webmaster Tools)

    - by anonymous coward
    I know that I may just have to link the image to make this happen, but I figured it was worth asking, just in case there's some other semantic markup or tips I could use... I have a site that uses the textual Creative Commons blurb in the footer. The markup is like so: <div class="footer"> <!-- snip --> <!-- Creative Commons License --> <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" /></a><br />This work by <a xmlns:cc="http://creativecommons.org/ns#" href="http://www.xmemphisx.com/" property="cc:attributionName" rel="cc:attributionURL">xMEMPHISx.com</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>. <!-- /Creative Commons License --> </div> Within Google Webmaster Tools, the list of relevant keywords is heavily saturated with the text from that blurb. For instance, half of my top-ten most relevant keywords come from it (the site name included): [site name], license, [keyword], commons, creative, [keyword], alike, [keyword], attribution, [keyword]. I have not done any extensive testing to find out whether or not this list even matters, and so far this doesn't impact performance in any way. The site is well designed for humans, and it is as findable as it needs to be at the moment. But, mostly out of curiosity: do you have any tips for decreasing the relevancy of the text from the Creative Commons footer blurb?
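
    One low-effort option (a sketch of an assumption, not anything Google documents) is to drop the visible sentence and keep only the badge image, leaving the rel="license" link so the licensing stays machine-readable while the footer no longer contributes indexable license text:

        <!-- Sketch: image-only Creative Commons badge; same by-nc-sa 3.0 US license as above -->
        <div class="footer">
          <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"
             title="Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License">
            <img alt="Creative Commons License" style="border-width:0"
                 src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" />
          </a>
        </div>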

    Read the article

  • How likely are IE9 jumplists to be useful?

    - by Grant Palin
    Having installed the Internet Explorer 9 release, I've experimented with the jumplists feature available in Windows 7 - drag a site tab down to the taskbar to create a jumplist. Works for Facebook and Twitter, anyway. I have my suspicions about the utility of this feature - it's a neat and possibly useful feature, yet is limited to the combination of IE9 and Windows 7, plus sites implementing the appropriate code. Given the relatively small audience at this point, is there any value in adding code to support this feature? And would it likely be more useful for a web application (e.g. Twitter, Facebook) than a typical website?
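
    For reference, pinned-site jumplist tasks are declared with meta tags in the page head; a minimal sketch (the site name, task names and URLs below are placeholders, not part of the original question):

        <!-- Sketch of IE9 pinned-site / jumplist markup -->
        <meta name="application-name" content="Example Site" />
        <meta name="msapplication-tooltip" content="Open Example Site" />
        <meta name="msapplication-starturl" content="http://www.example.com/" />
        <meta name="msapplication-task"
              content="name=Latest posts;action-uri=http://www.example.com/latest;icon-uri=http://www.example.com/favicon.ico" />
        <meta name="msapplication-task"
              content="name=Messages;action-uri=http://www.example.com/messages;icon-uri=http://www.example.com/favicon.ico" />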

    Read the article

  • CDN for site with target market in Australia

    - by Jae Choi
    I was told that http://www.edgecast.com/ is a very good CDN provider for the Australian market. I have a cloud server based in Sydney, Australia, but was wondering whether it's even worth getting a CDN since my target market is Australia-based as well. Would I see any performance gain if I used the above CDN service, or would this be more for sites that target international visitors? I have Apache installed on our server but would like to install Nginx. Would I see a bigger performance gain from that change than from a CDN, or should I go for both, since they are all beneficial?
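
    For what it's worth, much of the "Nginx gain" for a single-country audience comes down to compression and far-future cache headers, which either server can provide; a rough nginx sketch (directives are illustrative and the durations are assumptions):

        # Sketch only: compress text assets and cache static files for a month
        gzip on;
        gzip_types text/css application/javascript image/svg+xml;

        location ~* \.(css|js|png|jpg|jpeg|gif|ico)$ {
            expires 30d;
            add_header Cache-Control "public";
        }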

    Read the article

  • How to evaluate SEO/prominence improvement [on hold]

    - by Rober
    I will work on a website's SEO, and before starting I would like to "take a snapshot" of its present status so that I can compare it with the new situation in a few months and evaluate my work and the real improvement. I don't mean whether the website is well implemented or not, but how well it is seen by Google and others - what prominence it has. I am taking some variables from Google Analytics (average daily visits...), from Google Webmaster Tools (search traffic and average position...) and some other indicators, like automatic SEO audit figures (estimated website worth, real PageRank...). What would you look at before starting SEO improvement?

    Read the article

  • Ways to serve AWS from another domain

    - by mplungjan
    I have installed Ghost on AWS (it is running Node). I very much dislike the URL they gave me: http://ec2-nn-nnn-nnn-nnn.us-west-2.compute.amazonaws.com/ghost/. I own a domain and Linux hosting (but not a VPS) - what would be a practical way to serve my blog via URLs on my own (sub)domain? I can use PHP and access .htaccess on my domain, and possibly do things on the AWS instance too (let me know what to look for).
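
    One common approach (a sketch, assuming you control DNS for a subdomain and can install nginx on the EC2 instance): point e.g. blog.yourdomain.com at the instance's public IP and let nginx proxy to Ghost, which listens on 127.0.0.1:2368 by default; you would likely also update the url setting in Ghost's config so generated links use the new hostname:

        # Sketch: nginx server block on the EC2 instance
        # blog.yourdomain.com is a placeholder; point its A record at the instance's public/Elastic IP
        server {
            listen 80;
            server_name blog.yourdomain.com;

            location / {
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_pass http://127.0.0.1:2368;
            }
        }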

    Read the article

  • How can I make Twitter, Facebook and Reddit share buttons load last?

    - by Daniel Bingham
    I have a website with a number of pages that sport Twitter, Facebook and Reddit share buttons. They take forever to load, and until they do, the rest of the page doesn't load. So how can I make them load last? Currently, they are loaded something like this: <div class="item"><a href="http://twitter.com/share" class="twitter-share-button" data-count="vertical" data-via="FridgeToFood" data-related="danielBingham:Recipe and update tweets from Fridge to Food.">Tweet</a><script type="text/javascript" src="http://platform.twitter.com/widgets.js"></script></div> <div class="item"><script src="http://connect.facebook.net/en_US/all.js#xfbml=1"></script><fb:like layout="box_count" width="40"></fb:like></div> <div class="item"> <script type="text/javascript">reddit_target='recipes';</script> <script type="text/javascript" src="http://reddit.com/static/button/button2.js"></script> </div> They are in a div called "shareWrapper" and load to one side of the page. The buttons load wherever the script code is placed. As far as I know, I can't place the script code at the bottom of the page and move the resulting buttons after the fact. I want them to appear near the top, which right now means they stop everything below them from loading for several seconds. I tried loading them with JavaScript using jQuery's $(document).ready(), but that failed. It seems to leave the page in some sort of loading loop from which it never emerges. Are there other ways to get these to load last?
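
    One pattern worth trying (a sketch, under the assumption that the Twitter and Facebook loaders scan the document when they run; the Reddit button uses document.write internally, so it may need an iframe embed instead): keep the placeholder markup inside .shareWrapper where it already is, remove the inline script tags, and inject the loaders only after the page has finished loading:

        // Sketch: load the share-widget scripts after window load so they stop blocking the page
        function loadScript(src) {
            var s = document.createElement('script');
            s.src = src;
            s.async = true;
            document.body.appendChild(s);
        }

        window.addEventListener('load', function () {
            loadScript('http://platform.twitter.com/widgets.js');
            loadScript('http://connect.facebook.net/en_US/all.js#xfbml=1');
        });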

    Read the article

  • SEO: Single URL rewrite from one app to another

    - by user1909186
    I have two web applications running on two different servers. I want one, example.com/hello, to redirect to the second, hello.com. But I want both to contribute to each other's SEO ranking. What is the best way to accomplish this, primarily for Google search and also for other search engines? I currently do a permanent rewrite from example.com/hello to hello.com using nginx. Thanks for your help.
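
    For reference, a minimal nginx sketch of that permanent redirect (server names are placeholders matching the example):

        # Sketch: 301 example.com/hello and everything under it to hello.com
        location /hello {
            rewrite ^/hello(/.*)?$ http://hello.com$1 permanent;
        }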

    Read the article

  • Low pagerank backlinks - does Google penalize?

    - by Programmer Joe
    I have a new stock discussion forum and I would like to promote it. Specifically, I have two ideas in mind:
    1) Become a member at other stock discussion forums. Make high-quality posts, build a good reputation, and leave a link to my own forum in a non-intrusive way (i.e. in my signature or at the end of my posts). This approach makes sense because members of other forums are interested in stock discussion, and a backlink to my forum, as long as it is not done in an intrusive/spammy way, should come across as acceptable.
    2) Promote my site by writing articles at Squidoo, Hubpages, etc. This approach also makes sense because that's what Squidoo and Hubpages are for.
    The problem with both approaches is that when I leave a backlink to my site, the page I am leaving the backlink from may have a low PR - most likely, a PR of 0. Now, I have read that after the Penguin update by Google, your site can be penalized if you have too many backlinks from low-PR pages: http://www.entrepreneur.com/article/224339 So, I am caught in a dilemma:
    a) If I start promoting my site via other stock forums, Squidoo, Hubpages, etc., but the backlink to my site comes from a page with low PR, Google may penalize my site.
    b) However, if I don't promote my site, nobody will ever discover it (aside from other promotion techniques like social media promotion, directories, etc.).

    Read the article

  • What is the correct heading setup for subpages?

    - by user1010609
    What is best for SEO out of the following options?
    1. Using <h1>keyword</h1> in the layout and putting each subpage title in <h2>.
    2. Using <h1>keyword</h1> only for the main page, and on each subpage replacing it with <h2>keyword</h2> and using h1 tags for the subpage title.
    3. Not using <h1>keyword</h1> on any of the pages; instead putting the keyword in the header and using it for each subpage.
    4. Using <h1>keyword + something for main page title</h1>.
    5. None of the above (please go into as much detail as possible).
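
    For comparison, one common pattern (a sketch, not a ranking guarantee) reserves the single h1 on each subpage for that page's own title and keeps the site-wide keyword in the masthead as a plain link:

        <!-- Sketch: subpage markup with one h1 for the page's own title -->
        <header>
          <a href="/" class="site-name">Keyword / site name</a>
        </header>
        <h1>Subpage title (ideally containing the relevant keyword)</h1>
        <h2>Section heading</h2>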

    Read the article

  • separate domains vs subdomains [duplicate]

    - by Sharon
    This question already has an answer here: Registering multiple domains vs. subdomains. We manufacture a very versatile product that is used in a wide variety of products under multiple brands. In order to market these brands, should we create a separate domain for each brand/market, or use subdomains of our well-established main domain? What would be best for SEO without breaking the bank?

    Read the article

  • Best URL for a car-related website? [duplicate]

    - by Claudio ??is Mulas
    This question already has an answer here: What is the best structure for an SEO-friendly URL? If this were your website, what would the URLs be for each car on sale? http://www.autoscout24.eu/Details.aspx?id=247572735&asrc=ha I'm working on a car dealership website. What would be the best URL? Consider also that the company can have several models of the same car. I'm not asking for a URL scheme; there are a lot of similar questions about that. My question is: on a car dealership website, what is the best URL for a car? Which variables do you think are best to put in the URL - brand, model, year, location, color, miles/km, etc.? For this website, that URL, this particular case: what would you choose for the URLs? (even something not in the following list) audi_q5_2009.html audi_q5_2009_used.html audi_q5_2009_used.html audi_q5_2009_used_in_alcobendas.html audi_q5_2009_used/247572735.html

    Read the article

  • Create a filter to consider http://example.com/foo/bar as http://example.com/index.php/foo/bar

    - by magnetik
    I'm using URL rewriting to map my URL http://example.com/foo/bar/ to http://example.com/index.php/foo/bar. I'm not linking the index.php/.. URL anywhere, but for some reason some users arrive at the index.php URL. In Google Analytics, I have a lot of duplicates that make it quite annoying to follow up on the traffic. I've looked at the Advanced filters but I'm struggling to make them work. Any regex and Google Analytics pros who can help me out?
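
    One way to fold the duplicates together (a sketch of an Advanced filter at the view/profile level; either direction of normalisation merges the duplicates, and stripping index.php is the simpler regex):

        Filter Type:              Custom filter > Advanced
        Field A -> Extract A:     Request URI   ^/index\.php(/.*)$
        Output To -> Constructor: Request URI   $A1
        Field A Required:         Yes
        Override Output Field:    Yes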

    Read the article

  • Does Google contribute ranking from cdn.example.com to example.com?

    - by DesignerGuy
    Background: From my understanding, http://mywebsite.com/image.jpg can help the ranking of http://mywebsite.com in a search engine such as Google (obviously the search engine of primary concern). So, SEO-wise, moving an image to http://whatever-cdn.com/my-account/image.jpg is bad. A popular solution is to use a CNAME record, such as http://cdn.mywebsite.com, so that image.jpg can be accessed at http://cdn.mywebsite.com/image.jpg. The question: Does http://cdn.mywebsite.com/image.jpg rank as effectively as http://mywebsite.com/image.jpg? Does it help boost the main http://mywebsite.com? Or does it rank independently because it is a subdomain? Is there another option (a way to use a CDN without sacrificing ranking)?

    Read the article

  • Does Google sometime prevent new white hat sites from ranking at all in some verticals?

    - by JVerstry
    Assume someone wants to implement a new viagra or acai berry e-commerce website. There is a lot of competition, and this site does not really bring anything new, other than a new online counter to buy products at a nice price. Assume this site does not use any black-hat techniques and stays within Google's quality guidelines, and assume it has no (or few) backlinks (from non-authoritative websites). Assume this website's pages are indexed properly in Webmaster Tools and that no penalties are reported. No site improvements are suggested. Google crawls the site daily, as reported in GWT. No robots.txt configuration issues. Does Google sometimes decide not to rank such a site for any user query (for weeks) because of a lack of original content? The reason I am asking is that I am trying to understand the possible cause of a similar situation I am observing with two sites. If so, what is the way out, to start ranking for these sites? If not, does it mean the cause is elsewhere for sure? Any confirmed info to get out of the maze is welcome.

    Read the article

  • Sendmail encrypted

    - by user1948828
    I manage a website running on Apache. It has public and private areas. When people apply for an account to access the protected portions of the site, they do a TLS/SSL-protected POST containing their information, which is saved to a (hopefully) non-public directory on the server. Then I have a Python script which takes URL-encoded POSTs with this user information, sends back a plaintext confirmation to the applicant, encrypts their information with a freeware Java command-line utility to protect it (specifically this one: http://spi.dod.mil/ewizard.htm), base64-encodes it, puts it in a file as a MIME attachment and uses sendmail to forward the information to my (and several coworkers' scattered around the country) email account(s) on an Exchange server with Outlook clients. This has worked well for years, but it is awkward because it involves manually decrypting the information on a Windows box once it is received, using the above-mentioned encryption utility. This significantly limits how many applications can be processed. I would like to be able to encrypt the information in a format that Outlook/Exchange can inherently understand and display, so that these emails can be viewed simply by clicking on them. I do have company-provided PKI public certs for all the people I need to send to, and am able to send/receive encrypted emails in Outlook manually, but would like to know how I can send to Outlook from apache/linux/python on the command line using the same PKI certs. I don't need to receive them, just send. Is there a utility that can do this? I had thought PGP might, but I haven't been able to figure it out.
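
    Since you already hold the recipients' PKI certificates, one route (a sketch, untested against your Exchange setup) is S/MIME: encrypt the outgoing message with openssl smime against each recipient's certificate and hand the result to sendmail; Outlook can then decrypt and display it natively. Paths, addresses and certificate filenames below are placeholders:

        # Sketch: S/MIME-encrypt a message for Outlook recipients via openssl + sendmail
        import subprocess

        def send_smime(body_path, sender, recipients_with_certs, subject):
            """recipients_with_certs: list of (email_address, path_to_PEM_cert) tuples."""
            addresses = [addr for addr, _ in recipients_with_certs]
            certs = [cert for _, cert in recipients_with_certs]

            # Build a complete S/MIME message (headers plus encrypted body)
            # that any of the listed certificates can open.
            smime = subprocess.run(
                ["openssl", "smime", "-encrypt", "-aes256",
                 "-in", body_path,
                 "-from", sender,
                 "-to", ", ".join(addresses),
                 "-subject", subject] + certs,
                check=True, capture_output=True)

            # sendmail -t reads the recipients from the generated headers.
            subprocess.run(["/usr/sbin/sendmail", "-t"],
                           input=smime.stdout, check=True)

        send_smime("application.txt", "webform@example.gov",
                   [("reviewer@example.gov", "reviewer_cert.pem")],
                   "New account application")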

    Read the article

  • Abnormal Alexa ranking score

    - by SteenhouwerD
    I have 2 e-commerce websites in France, and for these websites the Alexa results behave strangely. Here are some statistics about them:
    Unique visits (January 2012) - Website A: 158,828 / Website B: 58,867
    Number of Google search results - Website A: 5,100 / Website B: 56,000
    Links to my site - Website A: 3,120 / Website B: 2,180
    Alexa score - Website A: 405,804 / Website B: 278,944
    How come website B, with a third of the visitors of website A, has a much better Alexa score (x2) than website A?

    Read the article

  • Does DFP Small Business allow geotargeting?

    - by Eric
    I'm working with a blog that has an advertiser who can only show ads for US/UK... so I'd like to set up an ad server that will show those advertiser's ads for US/UK customers, and then show Google Adsense ads for all other countries. It seems like DFP Small Business (Google's free ad server product) will do the job for all of this, but I'm not 100% certain it allows geotargeting as I've described. Is that possible?

    Read the article

  • Forum engine with full LDAP integration [closed]

    - by Andrian Nord
    We are looking for a forum engine which can actually maintain user data in LDAP, possibly via mods. The core point is the ability to maintain the data, i.e. all user profile settings, like nickname, password, email, avatar, birthday and others (preferably configurable). One example of good LDAP integration, at the level I'm expecting, is Drupal's LDAP integration, which allows mapping any user attribute into LDAP and keeps it in sync with the database. A year ago I did a small survey of existing free/FOSS engines and found a few forum engines with LDAP integration, namely SFM, phpBB and some others. The best-maintained solution was provided by phpBB3, which supports LDAP integration out of the box, but it is unable to sync data with changes made on the LDAP server by other software. Actually, it wasn't even propagating changes back, never mind the ability to map additional attributes (other than name/password/email). Also, I haven't found any forum whose architecture has a proper abstraction over user settings, so I doubt these engines (including phpBB) can be modded to add such functionality without dramatic changes to the core codebase. More recent research showed that even some commercial software, like IPB, is unable to keep its database synced with an LDAP directory and map additional attributes. In other words, all the support I've seen so far is simple user creation upon first login, which is not good for us, as the forum is not the primary site and should not maintain its own user base (to reduce the risk of collisions). LDAP import is required because many other services (FTP, email, Jabber, a Drupal site) use the same user base. Currently we have a forum embedded into the Drupal site, but we are unsatisfied with its features. BTW, we are using Linux, and this is not a duplicate of this question, as its author seems to be satisfied with the behaviour described above. So, my question is: are there any (preferably FOSS and free) forum engines that can import, export, keep in sync, or otherwise integrate with an LDAP user database (preferably with the ability to map additional fields to LDAP attributes)?

    Read the article

  • Which registrar checks the most domains?

    - by Christian W
    When I want a new domain, I usually use GoDaddy to check availability and another registrar to register. This is because GoDaddy checks my wanted domain against the most TLDs. Are there any other sites/registrars that check against more TLDs? What I want is to type my wanted second-level domain, e.g. bobsplace, and then have it search through bobsplace.com, bobsplace.net, bobsplace.me, etc., and report back to me which are available.
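
    Not a registrar recommendation, but as a rough self-serve check (a sketch; whois output formats differ per registry, so the "no match" phrases below are assumptions) you can loop one label over the TLDs you care about:

        # Sketch: rough availability check for one label across several TLDs via the system whois command
        import subprocess

        TLDS = ["com", "net", "org", "me", "info"]
        NOT_FOUND_HINTS = ("no match", "not found", "no entries found")

        def check(label):
            for tld in TLDS:
                domain = f"{label}.{tld}"
                out = subprocess.run(["whois", domain],
                                     capture_output=True, text=True).stdout.lower()
                status = "possibly available" if any(h in out for h in NOT_FOUND_HINTS) else "taken"
                print(f"{domain}: {status}")

        check("bobsplace")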

    Read the article

  • Domain Name Expired, Will My Backorder Work?

    - by Trent Scott
    I'm interested in a domain name that expired on August 9, 2012, and I backordered it a few months ago. When I check the status of the domain name, it is listed as "autoRenewPeriod". It has a new expiration date of August 9, 2013, but a Google search indicates that "autoRenewPeriod" means the registrar automatically renewed the domain but has not yet received payment. Does anyone have experience with this? How long will it stay in "autoRenewPeriod" before being released by the registrar? Do I have a good chance of grabbing the domain name?

    Read the article

  • What kind of website or coding is suitable and safe for an artist's website

    - by Dan S
    I have a web design project for a singer. I used Joomla for a previous project and designed good music websites with it, but for this project I cannot find a suitable template to edit and use. As the website is so simple and does not need any special functionality, I'm thinking about building it with just plain CSS, HTML and jQuery. I'm good at them and can make it look perfect, but I am not sure about the security. In Joomla I use various security plugins, but I don't know about securing client-side scripting. So generally I need your ideas about the following questions: Is Joomla, or a CMS in general, a good option for a music website? What are famous artists' websites based on - a CMS or client-side scripting? Do you recommend creating it manually without using any CMS or template? And do you suggest WordPress for this type of website? (The website will have these pages: Biography, News, Music (with a music player), Photos, Videos and Contacts.) That's it! Thank you for all your responses. I had a look at Joomla and the only template I chose is This One, which seems very simple, but I am worried about module positions, because it seems it does not have any module positions at all. I tried to contact the provider but did not get any response. Does anyone know about its module positions - I mean, is there any way to find them? And is it possible to create 2-3 module positions? Also, I had a look at ThemeForest's WordPress templates and it has such great templates. I think WordPress is more active in creating artistic templates. But is it secure and professional to use this CMS for a singer who is kinda famous in his country? I am talking about a template like this. Share your opinions, guys.

    Read the article

  • Goal completions 10x higher in dashboard

    - by cjk
    I have the following table in my Dashboard:
    Page path level 1    Visits    Goal Completions
    -----------------    ------    ----------------
    /sub1/               994       1,295
    /                    102       3
    /sub2/               10        1
    I know my conversion rate is 10-20%, and that actually in this period I only had 183 goal completions under /sub1/. My goal is set as a regular expression for a particular page (/success?.*), and I have a funnel set up which tracks the page before the goal (/action). The actual urls hit would be /sub1/action then /sub1/success?1234 and /sub2/action then /sub2/success?1234. Why is my table in my dashboard giving me wildly wrong numbers? Have I done something wrong?

    Read the article

  • How does one block unsupported web browsers?

    - by Sn3akyP3t3
    Web browsers that have reached end of life no longer receive security updates, which not only makes them vulnerable for the end user, but I imagine is also not safe for the servers that receive visits from them. Is it practical to block such browsers, or to notify the end user that their browser is unsafe and unsupported? If so, how would one achieve that? I don't know of any official or crowd-sourced listing with that information to parse and keep up to date. I'm aware that the practice can be custom built with user-agent parsing and feature detection for HTML5-enabled browsers.
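
    A minimal client-side sketch of the feature-detection route (the features tested and the wording are placeholders; this shows a warning banner rather than a hard block):

        // Sketch: warn visitors whose browser lacks a baseline of HTML5-era features.
        // Run near the end of <body>; testing a few capabilities is only a proxy for "unsupported".
        (function () {
            var supported = ('querySelector' in document) &&
                            ('localStorage' in window) &&
                            ('addEventListener' in window);
            if (!supported) {
                var warning = document.createElement('div');
                warning.className = 'browser-warning';
                warning.innerHTML = 'Your browser no longer receives security updates and is not fully supported here. Please consider upgrading.';
                document.body.insertBefore(warning, document.body.firstChild);
            }
        })();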

    Read the article
