Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.


  • How to edit this theme

    - by sanjana
    I recently created a community site with an open-source PHP platform. There is a "usefulness" page link in my theme that was added by the theme creator. I tried to contact him for help with this problem, but he didn't help me, so I decided to get help from SitePoint members. This link redirects my site to the member list section, but I don't like it and I want to remove it from my theme. How can I do it? I know a little about PHP and CSS. Which template section do I need to edit? Please help me. Here is that usefulness link

    Read the article

  • How to reverse engineer the SEO on a website?

    - by Startup Crazy
    I have read this question. My question is a bit different: I want to know how I can reverse engineer another website that ranks best for some keywords. For example, a website called www.bla.com ranks high for many keywords, and I want to learn from it how my website can reach the same authority and get the same ranking (or a better one, if I find something they are missing). Can anyone outline a procedure for reverse engineering a website's SEO?

    Read the article

  • Django - need to split a table across multiple locations [closed]

    - by MikeRand
    Hi all, I have a Django project to track our company's restructuring projects. Here's the very simple model:

        class Project(models.Model):
            code = models.CharField(max_length=30)
            description = models.CharField(max_length=60)

        class Employee(models.Model):
            project = models.ForeignKey(Project)
            employee_id = models.IntegerField()
            country_code = models.CharField(max_length=3)
            severance = models.IntegerField()

    Due to regulations in some European countries, I'm not allowed to keep employee-level severance information in a database that sits on a box outside of that country. In Django, how do I manage the need to have my Employee table split across multiple databases based on an Employee attribute (i.e. country_code) in a way that doesn't impact anything else in the project (e.g. views, templates, admin)? Thanks, Mike
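
    One possible direction is Django's database router hooks. The sketch below is a minimal illustration only; the router class name, the 'germany_db' alias, the 'DEU' country code and the settings entries are assumptions for the example, not something from the original question:

        # settings.py (sketch): one alias per in-country database, plus the router
        # DATABASES = {'default': {...}, 'germany_db': {...}}
        # DATABASE_ROUTERS = ['myproject.routers.EmployeeCountryRouter']

        class EmployeeCountryRouter(object):
            """Route Employee rows to an in-country database based on country_code."""

            # Hypothetical mapping of country codes to database aliases
            COUNTRY_DBS = {'DEU': 'germany_db'}

            def db_for_read(self, model, **hints):
                instance = hints.get('instance')
                if model.__name__ == 'Employee' and instance is not None:
                    return self.COUNTRY_DBS.get(instance.country_code, 'default')
                return None  # no opinion: let other routers / the default apply

            def db_for_write(self, model, **hints):
                return self.db_for_read(model, **hints)

    In practice the hints dictionary only carries an instance for some operations, so explicit .using('germany_db') calls on querysets may still be needed; treat this as a starting point rather than a complete answer to the views/templates/admin requirement.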

    Read the article

  • Google indexed my home page incorrectly: How can I fix it?

    - by louis_coetzee
    I finished my website and launched it. I think I had a problem with my robots.txt, so I changed it (on 03/08/2012) to look like this:

        # Allows all bots
        Sitemap: http://www.mysite.co.za/sitemap.xml
        User-agent: *
        Disallow: /dashboard/

    When I Google my domain.co.za, I get back "Home" with the description "A description for this result is not available because of this site's robots.txt – learn more. You've visited this page 3 times. Last visit: 2012/08/15". Now that I have fixed this and added a 301 redirect to send mysite.co.za to www.mysite.co.za, I would love it if Googlebot would come back for a visit. Is there anything I can do to get this fixed?
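
    A quick way to sanity-check the corrected robots.txt is Python's standard-library robot parser; this only verifies that the rules parse the way you intend, it does not make Google recrawl any sooner (the URLs are taken from the question and the 'Googlebot' user agent string is just an example):

        from urllib.robotparser import RobotFileParser

        # Fetch and parse the live robots.txt
        rp = RobotFileParser()
        rp.set_url("http://www.mysite.co.za/robots.txt")
        rp.read()

        # The home page should now be crawlable, the dashboard still blocked
        print(rp.can_fetch("Googlebot", "http://www.mysite.co.za/"))            # expected: True
        print(rp.can_fetch("Googlebot", "http://www.mysite.co.za/dashboard/"))  # expected: False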

    Read the article

  • Using Minified Page Specific JS [migrated]

    - by Mike C
    I've been working on a rather large scale project which makes use of a number of different pages with some very specific JavaScript for each of them. To lessen load times, I plan to minify it all into one file before deploying. The problem is this: how should I avoid launching page-specific JS on pages which don't require it? So far my best solution has been to wrap each page in an additional container:

        <div id='some_page'> ...everything else... </div>

    and I extended jQuery so I can do something like this:

        // If this element exists when the DOM is ready, execute the function
        $('#some_page').ready(function() { ... });

    Which, while kind of cool, just rubs me the wrong way.

    Read the article

  • Google Analytics: Track user usage and flow

    - by Quintin Par
    Can someone help me query Google Analytics to track a specific user's behavior and usage pattern? Currently I pass user ids to GA as _setCustomVar(2, 'id', id, 1); this is session based. But I have yet to master how I can use this to view the usage pattern and behavior for the passed id. Say I need to understand the flow visualization for one id, or the page view count for that id, etc. Rephrasing: can I filter all existing reports for a specific id that I select?

    Read the article

  • I want something ready to start with

    - by BDotA
    I am looking for something quick, maybe like a weblog on WordPress or Blogspot, where when I write a blog post I can put it there. For example, if I write something about .NET or Java or databases, it would be a quick tutorial with some small code samples that visitors can use. I don't know anything about web design and I just want a ready-made thing to use for this purpose. What do you suggest? Any examples I can take a look at?

    Read the article

  • Tracing and blocking IPs

    - by Mark
    I was wondering if there is any way you can block or exclude people from submitting multiple fake IPs online. For example, I have a Google pay-per-click campaign set up on AdWords. I downloaded a program in minutes which enabled me to hide or give out a fake IP address, and it also allows you to use a different IP each time. I tried this out on my own link on Google, which in turn got through AdWords, and I was charged for the click. My question is: how would you be able to counter or block someone who continuously clicks on your link with a different fake IP each time?

    Read the article

  • Gzipped Images - Is It Worth It?

    - by charlie
    Most image formats are already compressed. But in fact, if I take an image and compress it by gzipping it, and then compare the compressed one to the uncompressed one, there is a difference in size, even if not such a dramatic difference. The question is: is it worth gzipping images? The content size flushed down to the client's browser will be smaller, but there will be some client overhead when un-gzipping it. Please advise.
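
    A quick way to measure the real saving on your own assets; a minimal sketch, assuming a local JPEG called photo.jpg (a hypothetical file name):

        import gzip

        path = "photo.jpg"  # hypothetical sample image
        raw = open(path, "rb").read()
        compressed = gzip.compress(raw, compresslevel=9)

        print("original: %d bytes" % len(raw))
        print("gzipped:  %d bytes" % len(compressed))
        print("saving:   %.1f%%" % (100.0 * (1 - float(len(compressed)) / len(raw))))

    For already-compressed formats such as JPEG and PNG the saving is usually a few percent at best, which is why web servers are commonly configured to skip gzip for image MIME types and reserve it for text-based responses (HTML, CSS, JS).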

    Read the article

  • Most standard / Best way to keep the same top menu among different web pages?

    - by jsoldi
    What's the standard way to keep the same menu on top among different web pages without having to duplicate it on each page? (I don't mean that it doesn't reload, like when using frames and only loading the bottom part; I want the menu to scroll with the page when scrolling down, like this, this, this and pretty much every single web page that exists.) I found this answer, but that person can't use PHP and I can. Plus, I see several people giving different suggestions, but I assume there is a standard, since pretty much every single web page on the whole web has a menu on top that stays the same among multiple pages. I'm just a newbie at web design (I can program PHP and HTML easily, but I have no idea about standards and stuff like that, since I'm a self-taught guy ;)). What I would normally do is include the menu with PHP, but I'm not sure if this is the "standard".

    Read the article

  • Z-index preventing hover on another element [migrated]

    - by user18294
    I have two different elements (div class="") within a larger container. Let's call them div class="overlay_container" and div class="title". The div class="overlay_container" has a subclass, .image, which creates an overlay over the entire larger container on hover. The div class="title" has a z-index of 10,000 and lies over .image and therefore over the overlay. Unfortunately, when you hover over "title", the overlay image underneath disappears. I know the problem is that the "title" div sits directly over the other divs, so because of the z-index the hover effect is lost. But how do I fix this? How do I make it so that when you hover over "title", the .image overlay still appears? If your answer involves jQuery, could you please tell me where to put the script (before the /head tag)? Thanks!

    Read the article

  • Help with Google Pages; using them in general

    - by Sugarcube
    Is there any way someone could help me with building a website using Google Pages? I cannot for the life of me figure out how to do much more than create a new page, and I have been up all night. I thought I knew computers and websites, but apparently I need a refresher course. Please, if you can help, I would be most appreciative. -Sugarcube Edit: It seems this is too vague. But honestly, I don't even know where to begin with the way Google has it set up. I tried to set one up with a template and a layout, but the layout has pages that I did not create and that I don't know how to edit to make them suitable for my website. I guess if I had to start somewhere, it would be with how to make it do what I want it to do.

    Read the article

  • Making a subdomain the new main domain for SSL

    - by Dean Legg
    What would be your best advice for changing your main domain to a subdomain? Our site used to be example.co.uk but has now changed to https://secure.example.co.uk/. Any example.co.uk URLs redirect to the new secure domain. Effectively, example.co.uk is now just there to redirect any links and is no longer part of the site's URL structure. I have added the new domain https://secure.example.co.uk/ to Google Webmaster Tools, added the sitemap, and am waiting for it to be indexed. Is there anything else you would advise, and will this take away a lot of the juice from all the links I built for example.co.uk? I'm guessing this is not best practice, as I have struggled to find any information online on this subject.

    Read the article

  • Drupal node access for anonymous users

    - by MrDresden
    I've never used Drupal before, so this may be something that can easily be remedied, and that would be awesome. My problem is that a block containing node information can't be viewed by anonymous users (unregistered/not logged in); it gives a "You are not authorized to access this content." message, but it shows up for logged-in users. The nodes the block contains are events, so the block shows events for the next week. I've checked the user access settings but can't find anything that could possibly remedy this. I'm using Drupal core 6.26, Event 6.x-2.x-dev, Event views 6.x-2.4. If anyone has any information or solutions, I'd greatly appreciate it.

    Read the article

  • What is the correct frequency of changing content regularly?

    - by SSRB
    What is the correct frequency for changing content regularly? Suppose I have a site "Seven Sea" with 5 links named Home, About Us, Product, Sitemap and Contact Us. It is good for a site to change its content regularly, but is there any minimum or maximum frequency for doing this? Suppose I change my content daily; is that good from an SEO point of view? Or suppose I change my content once a year; is that bad for SEO? What is the better choice? A request: if this type of question has already been answered, please give me the link to that answer and do not close the question.

    Read the article

  • Change from static HTML file to meta tag for Google Webmaster verification

    - by Wilfred Springer
    I started verifying the server by putting a couple of static HTML files in place. Then I noticed that Google wants you to keep these files in place. I didn't want to keep the static HTML files, so I want to switch to an alternative verification mechanism and include the meta tag on the home page. Unfortunately, once your site is verified, you never seem to be able to change to an alternative way of verification. I tried removing the HTML pages. No luck whatsoever: Google still considers the site to be 'verified'. Does anybody know how to undo this? All I want to do is switch to the meta tag based method of site ownership verification.

    Read the article

  • Google Chrome browser does not display images from the local network or localhost [migrated]

    - by Arjang
    While developing and testing a site on the local network, I have noticed that while IE/Firefox will display images coming from URLs like "http:\mypc" or "http:\localhost" or an IP address on the network, Google Chrome on a Galaxy, a Windows machine, or Google Android will not display the images. After having burned too many hours trying various CSS, image src tag and HTML document description tricks, I have given up. During my search I found (and lost) the link to the Chrome forums regarding this issue, which seemed to say it is just so, according to users. Does anyone have any more information on whether it is possible to view images served from localhost, a computer name on the network, or an IP address?

    Read the article

  • Combining a content management system with ASP.NET

    - by Ek0nomik
    I am going to be creating a site that seems like it requires a blend of a content management system (CMS) and some custom web development (which is done in ASP.NET MVC). I have plenty of web development experience to understand the ASP.NET MVC side of the fence, but, I don't have a lot of CMS knowledge aside from getting one stood up. Right now my biggest question is around integrating security from ASP.NET with the CMS. I currently have an ASP.NET MVC site that handles the authentication for multiple production sites and creates an authentication cookie under our domain (*.example.com). The page acts like a single sign on page since the cookie is a wildcard and can be used in any other applications of the same domain. I'd really like to avoid having users put in their credentials twice. Is there a CMS that will play well with the ASP.NET Forms Authentication given how I have these existing applications structured? As an aside, right now I am leaning towards Drupal, but, that isn't finalized.

    Read the article

  • Is hidden content (display: none;) -indexed- by search engines? [closed]

    - by user568458
    Possible Duplicate: How bad is it to use display: none in CSS?

    We've established on this site before (in this question) that, since there are so many legitimate uses for hiding content with display: none; when creating interactive features, sites aren't automatically penalised for content that is hidden this way (so long as it doesn't look algorithmically spammy). Google's Webmaster guidelines also make clear that a good practice when using content that is initially legitimately hidden for interactivity purposes is to also include the same content in a <noscript> tag, and Google recommend that if you design and code for users, including users with screen readers or JavaScript disabled, then 9 times out of 10 good relevant search rankings will follow (though their specific advice seems more written for cases where JavaScript writes new content to the page): "JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what's contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser."

    So, best practice seems pretty clear. What I can't find out, however, is the simple factual matter of whether hidden content is indexed by search engines (but with potential penalties if it looks 'spammy'), or whether it is ignored, or whether it is indexed but with a lower weighting (like <noscript> content is, apparently). (For bonus points it would be great to know if this varies or is consistent between display: none;, visibility: hidden;, etc., but that isn't crucial.)

    This is different from the other questions on display: none; and SEO - those are about good and bad practice and the answers are discussions of good and bad practice; I'm interested simply in the factual yes-or-no question of whether search engines index, or ignore, content that is in display: none; - something those other questions' answers aren't totally clear on. One other question has an answer, "Yes", supported by a link to an article that doesn't really clear things up: it establishes that search engines can spot that text is hidden, it discusses (again) whether hidden text causes sites to be marked as spam, and it ultimately concludes that in mid 2011, Google's policy on hidden text was evolving and that they hadn't at that time started automatically penalising display: none; or marking it as spam. It's clear that display: none; isn't always spam and isn't always treated as spam (many Google sites use it): but this doesn't clear up how, or if, it is indexed.

    What I will do is follow the guidelines and make sure that all the content that is initially hidden, which regular users can explore using JavaScript-driven interactivity, is also structured in a way that noscript/screen-reader users can use. So I'm not interested in best practice, opinions, etc., because best practice seems really clear: accessibility best practice boosts SEO. But I'd like to know what exactly will happen: whether any display: none; content I have alongside <noscript> or otherwise accessibility-optimised content will be ignored, or indexed again, or picked up to compare against the <noscript> content but not indexed... etc.

    Read the article

  • Redirect URL ending with a dot

    - by Michael
    I submitted my site's URL to my workplace's printed newsletter, and when I got the printed version they had added a dot to the end of it. Some people will realize that the period is not part of the URL, but others will not. Is there an easy way to redirect from http://example.com/home. to http://example.com/home? I have IIS 7.0 shared hosting with GoDaddy, which means I have access to the box only through their interface, so some options might be limited.

    Read the article

  • How does 301 redirection work across the network, and should I use it if there is a chance we may need to change the resource back to the original URL?

    - by Faust
    I've built a CMS that makes it fairly easy for my client to relocate pages in their site hierarchy. This site has all human-readable and intuitive URLs, so moving a page necessarily means that its URL changes. I am storing records of each resource's past URLs in the data store so that requests for bygone URLs are re-routed to their appropriate successors. I'm warning my clients not to rearrange the site willy-nilly (for numerous reasons), but nevertheless I suspect there's a chance page moves could get reversed from time to time. So I'm trying to figure out whether 301, 302 or 307 redirects should be used when serving up pages for requests to out-of-date URLs. I understand the value of using 301 for search engine optimization, but my concern is that this system could inadvertently make some pages unavailable to some users. Questions: if the clients move a page at location/URL A to a new location B, users get the redirect from A to B, and then the clients move the page back to A again, how long can I expect any of those users to keep getting their requests for A redirected to B - in this case sending them to my friendly 404 page? Is it until an item in their browser history is cleared? Is the redirect somehow cached in routers throughout the internet? How does this work? How long can I expect the 301 redirect to linger out there?
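
    A quick way to see exactly which redirect (status code and target) your server currently sends for an old URL; a minimal sketch using the third-party requests library, with a placeholder URL:

        import requests

        # Ask for the old URL without following redirects, so the redirect response itself is visible
        resp = requests.get("http://example.com/old-location", allow_redirects=False)

        print(resp.status_code)              # e.g. 301, 302 or 307
        print(resp.headers.get("Location"))  # where the server is pointing clients now

    Note that this only shows what the server sends right now; it cannot reveal what an individual browser or intermediate cache has already remembered from an earlier 301, which is the part of the question that depends on client and cache behaviour.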

    Read the article

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom variable tracking to my company's website. The company website uses server side includes, so modifications to the tracking code happen identically everywhere; maintenance is therefore a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, and that makes troubleshooting a headache. So say you have custom variables like these:

        // visitor level tracking, id = 12345
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);

        // page level tracking, email = [email protected]
        _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]);

    The marketing people want the following out of this:

    1. A user visits the site and we record a unique id for them. Whenever they return, this id is used in GA.
    2. A user signs up for our newsletter on page X and we record their email address. Whenever they return, this email address is used in GA.

    Now, a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work. Do I want page, session or visitor level tracking? What happens because the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but then they go somewhere else where the email is nonexistent, will the value get 'overwritten'? Sorry for the long question, but there are a lot of unknowns for a GA noob.

    Read the article

  • Is it possible to tell a search engine not to index a specific section of an HTML page? [closed]

    - by Justin
    Possible Duplicate: Preventing robots from crawling specific part of a page

    I know you can use robots.txt to ignore entire pages or sections of your site, but is there a way to tell crawlers like the Googlebot to ignore specific sections of an HTML page? I found this blog post that discusses one method, but it appears to work only for the Google Search Appliance, not the Googlebot. Is there some method, at least for Google, to do this?

    Read the article

  • How to add SMS text messaging functionality to my website?

    - by jessegavin
    I want to add the ability to send reminders to people via email and SMS for specific events that they have signed up for on a web application that I am building. The email part is not difficult, but I am wondering where to find a good solution for sending SMS messages. It would also be a plus if this solution allowed two-way SMS communication with my web application, so that people would be able to reply with a CONFIRM or CANCEL type of message. Has anyone implemented something like this? Does anyone know of good tools out there? EDIT: I realize that this is more of a "lots of ways to skin this cat" type of question, and so I changed it to community wiki.
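
    As an illustration of the hosted-gateway approach, here is a minimal sketch using the Twilio Python helper library; the credentials and phone numbers are placeholders, and other SMS gateways expose similar REST APIs:

        from twilio.rest import Client

        # Placeholder credentials from the provider's dashboard
        account_sid = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
        auth_token = "your_auth_token"

        client = Client(account_sid, auth_token)

        # Send an outbound reminder
        message = client.messages.create(
            body="Reminder: your event starts at 7pm. Reply CONFIRM or CANCEL.",
            from_="+15005550006",   # number provisioned with the gateway
            to="+15551234567",      # the subscriber's number
        )
        print(message.sid)

    Two-way messaging (the CONFIRM/CANCEL replies) is typically handled by giving the gateway a webhook URL on your application that it calls whenever an inbound SMS arrives.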

    Read the article

  • How to push through a domain transfer in spite of the 60-day rule

    - by corsiKa
    I recently purchased a domain through a registrar which I won't name here. Within the first five minutes of logging in, I found a severe vulnerability that allows me access to all registration details of all users. Simply put, I do not trust this registrar with any kind of business. But I'm unable to transfer the domain because, for some reason, it has to exist in its current state for 60 days. We're planning to launch the site this weekend - we can't wait 60 days. But I can not trust this registrar: if I found such a severe vulnerability in the first few minutes, how many more similar un-trustables will I find in those 60 days? Is there a higher authority to whom I can submit a case to get my domain transferred to a different registrar?

    Read the article
