Search Results

Search found 31417 results on 1257 pages for 'site structure'.

Page 602/1257 | < Previous Page | 598 599 600 601 602 603 604 605 606 607 608 609  | Next Page >

  • Review of Agile Project Management Software

    - by John K. Hines
    Bright Green Projects have an admittedly older blog post entitled Review of Agile Project Management Software | Scrum Kanban Methodology. Since I haven't had time to review Scrum project management tools in quite a while, it was nice to find a write-up as succinct as this one. The thing I like best about Bright Green's site, besides the product, is the vocabulary they use to describe Agile software development. For example, they couple Scrum with the development methodology they're using (Lean Kanban). Many organisations simply say they're using Scrum, which itself doesn't prescribe any engineering practices. It would add some clarity for teams to adopt the Scrum-Method terminology. At least then you could know if you're walking into a Scrum-Chaos situation.

    Read the article

  • Cancelling your Windows Phone Dev account is very difficult.

    - by Sahil Malik
    SharePoint, WCF and Azure Trainings: more information Here is Microsoft’s new business model: make it so damned difficult to cancel your Windows Phone dev account subscription that you will just give up trying and pay as the easy route out. Very sad that it has come to this. Usually I would not approach an open forum such as my site for such issues, but the sad state of their affairs leaves me with no choice. Here is the issue: last year I opened a WPDev account, for which you have to pay. Seeing that it's been a year and I haven't submitted anything, I didn't want to renew my account and pay the fee. I guess if I ever write a WP app, I will reopen the subscription. Sounds about right, huh? Fair? So, what would you expect? Log in to your account, find the subscription, hit cancel, right? No, not really! Read full article ....

    Read the article

  • Good sites for sharing code snippets & pastes that you can share links to?

    - by acidzombie24
    I know there are site tools to check if your webpage is alive, has compression, etc., but let's not get into that. What are useful sites to paste code in and share links to it? The three I know are: http://codepad.org/ (shows source and runs code online), http://www.pastie.org/ (share source with syntax highlighting), and http://jsfiddle.net/ (great for JS help or the occasional test). What else do you know of? One answer per question. I'll let lints and validators slide since you do paste code into them. Mention a weakness if you know one so others won't be surprised or disappointed.

    Read the article

  • Confused about the X.Org version number

    - by caligula
    Here is my trouble: I'm trying to install ATI drivers on my Dell Vostro 3350 laptop. On AMD's site I followed the steps to determine which driver I should use. After selecting my hardware I got a link to download amd-driver-installer-12-3-x86.x86_64.run, and below it was a note that the driver is compatible with X.Org versions 6.7 to 7.6. But I do not know which version of X.Org my Ubuntu uses. After googling I found that the version in Ubuntu 11.10 is 7.6, but after typing X -version I got the output: X.Org X Server 1.10.4, Release Date: 2011-08-19, X Protocol Version 11, Revision 0. So what is the exact version of X.Org in Ubuntu 11.10, and how should one reliably determine it? Can anyone clarify? Thanks.
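
    As a side note on checking this: on Debian/Ubuntu systems the X.Org release number (the 6.7-7.6 range) is tracked by the xserver-xorg meta-package, while X -version reports the X server's own version. A minimal Python sketch that queries both package versions; the package names assume a stock Ubuntu install:

    ```python
    import subprocess

    def pkg_version(package: str) -> str:
        """Return the installed version of a Debian package, or '' if not installed."""
        result = subprocess.run(
            ["dpkg-query", "-W", "--showformat=${Version}", package],
            capture_output=True, text=True,
        )
        return result.stdout.strip() if result.returncode == 0 else ""

    if __name__ == "__main__":
        # xserver-xorg follows the X.Org release (e.g. 1:7.6+...), while
        # xserver-xorg-core follows the X server itself (e.g. 2:1.10.4-...).
        for pkg in ("xserver-xorg", "xserver-xorg-core"):
            print(pkg, "->", pkg_version(pkg) or "not installed")
    ```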

    Read the article

  • open source database project

    - by Jeff V
    What is the best way to build an open source database? I would like to build a database of all vehicles and the related maintenance information (e.g. oil weight, quantity, tire pressure, windshield wipers, etc.). Currently this information is fragmented or just not put online in an open way. Once collection begins I would want to import it into a DB and then be able to distribute it freely. Is there a process (site or group) through which I can start gathering this information in a reliable and verifiable way? Are there any issues that I should watch out for?
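
    To make the data side concrete, here is a rough sketch of what such a schema could look like, using Python's built-in sqlite3 module; the table and column names are purely illustrative, not something proposed in the original post:

    ```python
    import sqlite3

    # Illustrative schema only: one table for vehicles, one for maintenance specs.
    conn = sqlite3.connect("vehicles.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS vehicle (
        id    INTEGER PRIMARY KEY,
        make  TEXT NOT NULL,
        model TEXT NOT NULL,
        year  INTEGER NOT NULL
    );
    CREATE TABLE IF NOT EXISTS maintenance_spec (
        vehicle_id INTEGER REFERENCES vehicle(id),
        item       TEXT NOT NULL,   -- e.g. 'oil weight', 'tire pressure'
        value      TEXT NOT NULL,   -- e.g. '5W-20', '32 psi'
        PRIMARY KEY (vehicle_id, item)
    );
    """)
    # Example rows, just to show the shape of the data.
    conn.execute("INSERT OR IGNORE INTO vehicle VALUES (1, 'Honda', 'Civic', 2010)")
    conn.execute("INSERT OR REPLACE INTO maintenance_spec VALUES (1, 'oil weight', '5W-20')")
    conn.commit()
    conn.close()
    ```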

    Read the article

  • How can I exclude content in my notifications bar from being indexed?

    - by Liam E-p
    Of course I want my content to be indexed quickly by search engines, but not my notifications bar. My notifications bar contains the last 30 changes to content on the site, and I don't want this to show up in my SEO meta. As the notifications are all generic, they often don't provide any relevant information. For example, if an article named "123" was created, it would create a notification that says "Article "123" was created by xxx at 12:00AM". I'm now wondering if this is a content design problem, as only a third of this information is actually relevant to users (the title and what happened). By SEO meta and irrelevant notification data being shown, I mean this - Basically, what I was wondering is how I could optimise this so search engines wouldn't show this generic nonsense.

    Read the article

  • How can I alias domains to subdomains?

    - by user745668
    I have a main site with a bunch of subdomains created. Each subdomain is a blog, and I want each blog to have its own domain name, i.e. thisguy.com -> blog1.mainsite.com and thatguy.com -> blog2.mainsite.com. I bought the new domains and I set up the CNAME records as above to alias them to the appropriate subdomains. However, I get my host's "a domain is pointing to one of our servers but we don't know anything about it" landing page. How can I set up these domains as aliases of my subdomains?
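
    For anyone debugging a setup like this, a first step is to confirm that the CNAMEs themselves resolve as intended, as opposed to the web host simply not recognising the extra hostnames. A quick check with the third-party dnspython library might look like this sketch; the domain names are the hypothetical ones from the question:

    ```python
    import dns.resolver  # pip install dnspython (>= 2.0 for resolver.resolve)

    # Expected CNAME targets, taken from the hypothetical mapping in the question.
    expected = {
        "thisguy.com": "blog1.mainsite.com.",
        "thatguy.com": "blog2.mainsite.com.",
    }

    for domain, target in expected.items():
        try:
            answer = dns.resolver.resolve(domain, "CNAME")
            found = answer[0].target.to_text()
            status = "ok" if found == target else f"unexpected target {found}"
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            status = "no CNAME record found"
        print(f"{domain}: {status}")
    ```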

    Read the article

  • Graphic demonstrating emphasis of front end in web apps

    - by sohail
    I remember stumbling across an amusing graphic a year or so ago which demonstrated the tiers of web development. The back end was shown as a tiny box, but the front end was shown as a huge box crammed with lots of front-end technologies like AJAX and DHTML. This is all a vague recollection. Does anyone know where on the Intraweb this graphic might be? It was probably on a programming cartoon site, but I only view XKCD on a regular basis and I couldn't find it there. Although tagged as fun, my request does have a productive edge to it - it would be quite useful in driving home to my colleagues how UI-top-heavy web application development has become.

    Read the article

  • My blog, which gets 300+ daily impressions, has stopped appearing on the first page of Google

    - by Sangram
    I have a blog regarding placement papers from December 2010. My monthly impressions are around 4000. For the last 2 days, my blog has disappeared from Google's search engine result pages and impressions have dropped drastically. Please check the stat reports: My blog is still in the index, because when I search site:mydomain.com on Google I can see all my pages indexed there… But my pages which used to appear on the first or second pages of Google do not appear any more. Example: if I search with the query "GE round 2 code writing test" on bing.com or Yahoo search, the first link on the result page is my blog. But if you do the same on Google, my URLs do not appear even in the first 3 result pages. I used to get lots of visitors from these search queries earlier.

    Read the article

  • Anti-aliasing problem

    - by byronyasgur
    I am auditioning fonts on Google Web Fonts, and one that I was discounting was Ubuntu because it looked a bit jagged (screenshot below taken straight from Google); however, I afterwards read an article where it was mentioned as a good choice, and there was a screenshot where it looked really good (to me anyway). I am using Windows 7 and have tried looking at it in Chrome and Firefox. I notice the same thing with some other fonts, but this one is a good example because it looks perfect in the screenshot but not so good when I look at it on their site. I know this is essentially a question about my computer's settings, but I thought this would be the best place to pose it: is there something wrong with the settings on my machine, seeing as it's obviously not showing the font the same on my computer as it did when the article writer downloaded it and used it in an image? The screenshot from Google ... The screenshot from the article above ...

    Read the article

  • [SOLVED] BleachBit: How to Completely Clear URL History in Firefox?

    - by tSquirrel
    Ubuntu 14.04 / Firefox 29.0. I've been using BleachBit to clear usage/file history, and for the most part it works great. However, it doesn't seem to clear the website hostnames out of the URL bar at all. These addresses are not bookmarked, and it isn't the full URL that's preserved, just the hostname. Steps: (1) visit http://www.bluesnews.com/some_random_URL_string; (2) exit Firefox; (3) run BleachBit with ALL Firefox options selected; (4) restart Firefox; (5) check history: completely empty, other than bookmarked sites, and www.bluesnews is NOT bookmarked; (6) type "blue", which Firefox automatically completes as "http://www.bluesnews.com/". Alternate step (3): use Firefox's built-in "Clear History" and select ALL entries with a time frame of "Everything"; same result as above. My inquiry in the BleachBit forums hasn't been responded to. I found Dan's proposed solution; however, changing autocomplete in about:config only turns off the function, it doesn't actually stop URLs from being stored. SOLVED - See my comment in the "Answer" response from Tim
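
    For anyone who wants to verify what is actually left on disk after a cleaning pass, Firefox keeps history in places.sqlite inside the profile directory, and in Firefox builds of this era the hostnames used for URL-bar autocompletion live in the moz_hosts table of the same file. A rough Python sketch to inspect it; the profile path is a placeholder you would need to adjust:

    ```python
    import sqlite3
    from pathlib import Path

    # Placeholder path: replace "xxxxxxxx.default" with your real profile folder.
    places = Path.home() / ".mozilla/firefox/xxxxxxxx.default/places.sqlite"
    if not places.exists():
        raise SystemExit(f"profile database not found: {places}")

    # Open read-only so we can't accidentally modify the profile.
    conn = sqlite3.connect(f"file:{places}?mode=ro", uri=True)
    for table in ("moz_places", "moz_hosts"):
        try:
            count = conn.execute(f"SELECT count(*) FROM {table}").fetchone()[0]
            print(f"{table}: {count} rows")
        except sqlite3.OperationalError:
            print(f"{table}: not present in this profile")
    conn.close()
    ```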

    Read the article

  • Photo management utilities

    - by Frantumn
    I'm about to develop a web site for a new client. It's not going to be very intense, but one requirement is that, if possible, the client wants to be able to manage the photo gallery themselves. Since they are not technically savvy at all, I was wondering what utilities exist that provide a GUI for users to log in and manage photos. Can anyone make a recommendation? I haven't purchased the web hosting yet, so if your answer requires a specific type of host server, don't worry; I am open to options.

    Read the article

  • PowerPivot Workshop in Frankfurt (and London early-bird expiring soon) #ppws

    - by Marco Russo (SQLBI)
    One week ago I described the PowerPivot Workshop Roadshow that we are planning in several European countries. The news today is that the Workshop will be in Frankfurt (Germany) on February 21-22, 2011! Registrations are open on the www.powerpivotworkshop.com web site. The early-bird price for Frankfurt will expire on February 4, 2011. And if you are planning to attend the London date on February 7-8, remember that the early-bird price for London is going to expire on Monday (January 17)! Save your money...(read more)

    Read the article

  • My page no longer shows up in Google's results for a keyword

    - by user6456
    I have a small website about a commercial product, with a description and tutorial. Two days ago it was in 11th position in Google's search results, without any kind of SEO optimization on my part. Today it's gone. Totally gone - not even in the first 200 results. It's still very high in bing.com and duckduckgo.com. The site is very on topic: it's hosted under the domain Keyword.com, and it's about a commercial product which addresses that keyword. How can I find out what happened?

    Read the article

  • Preventing Duplicates on Google

    - by abel
    I am currently using a rewrite rule to enable access to .php pages without the .php extension. However, to prevent old links from breaking, the pages can still be accessed via links containing the .php extension too. For example, domain.com/page.php can now be accessed at domain.com/page, and all links within the site now use the domain.com/page form. However, older incoming links will still point to the .php pages, meaning Google will index both URLs and mark them as duplicates. I have two plans to remedy the situation. Use a PHP 301 redirect: when a page is accessed with the .php extension, redirect it individually with a 301 issued from PHP. Use a canonical tag: place a canonical tag on each page, pointing to the extension-less version. My question: are both methods equally efficacious in preventing Google from indexing my ".php" pages? Which method should be preferred, by convention or otherwise?
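
    The question is about PHP, whose code isn't shown here; purely as an illustration of the 301-redirect approach (option 1), the same idea sketched in Python with Flask, with made-up route logic, would look roughly like this:

    ```python
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/<path:requested>")
    def page(requested):
        # If the old .php form is requested, send a permanent (301) redirect to
        # the extension-less URL so crawlers drop the duplicate from the index.
        if requested.endswith(".php"):
            return redirect("/" + requested[: -len(".php")], code=301)
        return f"content for {requested}"

    if __name__ == "__main__":
        app.run()
    ```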

    Read the article

  • What kind of spam is this?

    - by SSilk
    I realize this is a pretty vague question, but I occasionally get spam messages through my contact form on a Drupal 6 site. The contact form does not have any anti-spam protection (e.g. a math question). The messages I get are all very similar and just jumbled junk, like the one below, so I think they're all from the same source. Example: ylsaf0V bpsdfuxnhjjd, [url=http://wwgfsggzgyjyjm.com/]wwgrfgzrgsjyjm[/url], [link=http://xmgvyghcuufvb.com/]xmjyhvyjyfjirovb[/link], http://frgxmdghrgruhfc.com/ Anyway, I'm just wondering what the point of such a message is. All the links are dead, it's illegible, and it's not trying to sell me a product or get me to do anything, so I'm a bit perplexed. Is there any way to tell where these are coming from? And how concerned should I be? To be clear, I'm not asking how to avoid them; I realize that just adding a simple math challenge or CAPTCHA would likely do the job.

    Read the article

  • The new Google Analytics - what new useful features have you found?

    - by Rob
    If you don't know already, a new version of Google Analytics has just come out. At first glance it doesn't seem like much of an improvement over the previous version. There's lots of linking to Google's social stats, but I've yet to see the value of that. Also, it doesn't seem to make the best use of the important data; it tends to push referral sites and keywords to the back and bring less important data to the front. Is that a sign of things to come? One feature I did find interesting was Visitors Flow, as it shows a visitor's path through your site. What new features have you found useful or interesting?

    Read the article

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease), although I didn't see any change during Panda 1-3. From what I've read, and comparing it to my site, the change is probably due to the fact that I have over 30M pages indexed on Google, and they've started seeing that as some sort of bad signal. Although I feel all of the pages have a unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to: 25M, 15M, 1M, 2000? Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?

    Read the article

  • Separate URLs for a set of pages sharing 80% duplicate content

    - by user131003
    Issue: currently my site has one particular page which has country-specific data, so I have URLs like mysite.com/sale-united-states, mysite.com/sale-united-kingdom, mysite.com/sale-sweden, etc. All these pages have 80-90% common content and 10-20% country-specific content, and currently they all canonically point to mysite.com/sale-united-states. The problem is that when someone searches for "sale Sweden", Google shows the mysite.com/sale-united-states page, which does not feel correct, as it shows the US page instead of the Swedish one. Now I'm thinking of not using the canonical URL, so that country-specific URLs show up in Google search. But I'm not sure how 80% duplicate content is going to affect SEO. What is the recommended approach for this situation? A friend of mine suggested a "separate subdomain per country" approach, but it seems overkill for one page.

    Read the article

  • I have a large number of links on every page; for design reasons I want to keep them, but is it hurting my SEO?

    - by Callum Rexter
    The site is http://www.centralsaddlery.co.uk. We have other issues which we are tackling in terms of content etc., but the question I have is: "Is my main navigation hurting us in SEO?" It's a lot of links, and it's on a lot of pages. If so, is there a way to get Google to ignore links below the top level? I had thought Google would see that the links are hidden by default and only shown on hover, but I can't verify this at all. We absolutely want to keep the menu; our customers like it and so do we, and we think it is pretty usable as we have a lot of products to look at. Any advice is appreciated (and any tips for any part of the SEO are welcome too).

    Read the article

  • Adsense alternative for a "Sex Education" website?

    - by WhatIsOpenID
    I am creating a nice and niche "Sex Education" website. No porn, nothing offensive and no scams. I would love to place Google AdSense, but they do not allow ads on adult sites. I would like to know what advertising and link-exchange options I should place on my site. My sole objective is to cover the server costs and the salary of one or two people. (In this way, it is different from "Best alternative to Adsense for a small website?".)

    Read the article

  • Asus 1215n GPU drivers don't give me a "full" OS experience

    - by AFD
    I'm used to not having manufacturer-specific drivers on my laptop when running a Linux OS, and that has always been fine - there have been adequate FOSS drivers for my needs and it hasn't ruined any of my OS experience. When I bought an Asus 1215n, one of the upsides to the hardware seemed to be the switchable GPU that could give lots of performance or lots more battery life and would switch on the fly... with Windows, of course. It seems that the Nvidia drivers are crap and people advise not installing them. I have some sort of workaround for vga_switcheroo (?), and the on-the-fly nature of the GPU switching has turned into a manual one :( The worst bit, though (aside from shorter battery life), is the web experience with HTML5. If I visit Mozilla's Web O'Wonder site I'm told I don't have WebGL working due to driver issues. This really blows - is it possible that proprietary drivers can now ruin my web experience too?!
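
    For context on the manual switching mentioned above: the kernel's vga_switcheroo state is exposed through debugfs and can be read as root, assuming debugfs is mounted at /sys/kernel/debug. A tiny Python sketch:

    ```python
    from pathlib import Path

    # vga_switcheroo state lives in debugfs; reading it requires root and a
    # kernel built with hybrid-graphics (vga_switcheroo) support.
    switch = Path("/sys/kernel/debug/vgaswitcheroo/switch")

    if switch.exists():
        # Typical lines look like "0:IGD:+:Pwr:0000:00:02.0", where "+" marks
        # the GPU currently driving the display and Pwr/Off is its power state.
        print(switch.read_text(), end="")
    else:
        print("vga_switcheroo interface not available (debugfs unmounted or unsupported)")
    ```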

    Read the article

  • Remove third/nth-level domains from the Google index

    - by drakythe
    Somehow Google has indexed some third- (and fourth!) level domains that I had attached to my server temporarily, e.g. my.domain.root.com. I now have these redirected properly to where I would like them to go; however, with a carefully crafted search one can still find them, and I'd rather they not be exposed. My Google-fu has failed me in finding an answer, so I come to you wonderful folks: how do I remove sub-level domains from Google's search results? I have the site verified in Google Webmaster Tools, but all the URL removal requests I can perform append the URL to the base URL, not prefix it. And finally, how can I prevent this in the future?

    Read the article

  • Canon multifunction printer: scanner doesn't work, printer does

    - by Holger Böttcher
    Model: Canon Pixma MG2260. I have installed Ubuntu 14.04 Trusty Tahr, and now I have a problem with my multifunction device: the printer works, but the scanner doesn't! I have downloaded Canon's Linux software for the 2200 series from Canon's web site: 6 zipped folders, 3 .deb and 3 .rpm, each set with one for the computer, one for the scanner, and one for other things. I can download them to my PC and unpack them, but they do not install. Does anyone know what I have to do about this?

    Read the article

  • Online Poker Game Programming

    - by Eyal
    I am trying to write a massively multiplayer online client for a poker site, where one user can be on a Flash client and another on, say, an iOS client (iPhone/iPad), and I would like to know how interaction between two users can be made visible on both clients. What would be best to use? Should I use MSMQ? AJAX? Something else? I need the messaging layer (client interaction messages) to scale up to 100K+ online users to begin with. In other words: what scalable technology can I use to make game interactions between online users visible to all game participants?
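
    As one common shape for such a messaging layer (a sketch, not necessarily what the poster ended up using), a WebSocket hub that re-broadcasts each table event to every connected client works the same way whether the client is Flash, iOS or a browser. A minimal Python example using the third-party websockets library:

    ```python
    # pip install websockets
    import asyncio
    import websockets

    connected = set()  # all clients currently watching the same table

    async def handler(ws, path=None):  # "path" kept for older websockets versions
        connected.add(ws)
        try:
            async for message in ws:
                # Re-broadcast every action (bet, fold, chat, ...) to the other
                # clients, so every platform sees the same game state.
                for peer in list(connected):
                    if peer is not ws:
                        await peer.send(message)
        finally:
            connected.discard(ws)

    async def main():
        async with websockets.serve(handler, "0.0.0.0", 8765):
            await asyncio.Future()  # run until cancelled

    if __name__ == "__main__":
        asyncio.run(main())
    ```

    To approach the stated 100K+ users, several such hubs would typically sit behind a load balancer with a shared pub/sub backend (Redis, a message queue, etc.) so that events reach clients connected to different processes.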

    Read the article
