Search Results

Search found 14905 results on 597 pages for 'reporting tools'.

Page 4/597

  • Ubuntu 12.04 LTS initramfs-tools dependency issue

    - by Mike
    I know this has been asked several times, but each issue and resolution seems different. I've tried almost everything I could think of, but I can't fix this. I have a VM (VMware, I think) running 12.04.3 LTS which has stuck dependencies. The VM is on a rented host, running a live system, so I don't want to break it (further).
    uname -a
        Linux support 3.5.0-36-generic #57~precise1-Ubuntu SMP Thu Jun 20 18:21:09 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
    Some more:
    sudo apt-get update
        [sudo] password for tracker:
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        You might want to run ‘apt-get -f install’ to correct these.
        The following packages have unmet dependencies.
          initramfs-tools : Depends: initramfs-tools-bin (< 0.99ubuntu13.1.1~) but 0.99ubuntu13.3 is installed
        E: Unmet dependencies. Try using -f.
    sudo apt-get install -f
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following extra packages will be installed: initramfs-tools
        The following packages will be upgraded: initramfs-tools
        1 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
        2 not fully installed or removed.
        Need to get 0 B/50.3 kB of archives.
        After this operation, 0 B of additional disk space will be used.
        Do you want to continue [Y/n]? Y
        dpkg: dependency problems prevent configuration of initramfs-tools:
         initramfs-tools depends on initramfs-tools-bin (<< 0.99ubuntu13.1.1~); however:
          Version of initramfs-tools-bin on system is 0.99ubuntu13.3.
        dpkg: error processing initramfs-tools (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates it's a follow-up error from a previous failure.
        dpkg: dependency problems prevent configuration of apparmor:
         apparmor depends on initramfs-tools; however:
          Package initramfs-tools is not configured yet.
        dpkg: error processing apparmor (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates it's a follow-up error from a previous failure.
        Errors were encountered while processing:
         initramfs-tools
         apparmor
        E: Sub-process /usr/bin/dpkg returned an error code (1)
    If I look at the policy behind initramfs-tools / initramfs-tools-bin I get:
    apt-cache policy initramfs-tools
        initramfs-tools:
          Installed: 0.99ubuntu13.1
          Candidate: 0.99ubuntu13.3
          Version table:
             0.99ubuntu13.3 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise-updates/main amd64 Packages
         *** 0.99ubuntu13.1 0
                100 /var/lib/dpkg/status
             0.99ubuntu13 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise/main amd64 Packages
    apt-cache policy initramfs-tools-bin
        initramfs-tools-bin:
          Installed: 0.99ubuntu13.3
          Candidate: 0.99ubuntu13.3
          Version table:
         *** 0.99ubuntu13.3 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise-updates/main amd64 Packages
                100 /var/lib/dpkg/status
             0.99ubuntu13 0
                500 http://gb.archive.ubuntu.com/ubuntu/ precise/main amd64 Packages
    So the issue seems to be that I have 0.99ubuntu13.3 for initramfs-tools-bin yet 0.99ubuntu13.1 for initramfs-tools, and can't upgrade the latter to 0.99ubuntu13.3. I've run apt-get clean/autoclean/install -f/upgrade -f many times but they won't resolve it. I can think of only two other 'solutions':
    1. Edit the dpkg dependency list to trick it into doing the installation with a broken dependency. This seems very dodgy and would be a last resort.
    2. Downgrade both initramfs-tools and initramfs-tools-bin to 0.99ubuntu13 from the precise/main sources and hope that gets them back in step.
    However, I'm not sure if this will be possible, or whether it would introduce more issues. I'm also not sure how this situation arose in the first place. /boot was 96% full; it's now 56% full (it's tiny - 64MB ... this is what I got from the hosting company). Can anyone offer advice please?
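    A minimal sketch of the version-pinning approach, using the version strings from the apt-cache output above; this is an assumption on my part rather than something from the original post, and on a live system it should be rehearsed on a snapshot or clone first:
        # Bring initramfs-tools up to the same version as initramfs-tools-bin
        sudo apt-get install initramfs-tools=0.99ubuntu13.3 initramfs-tools-bin=0.99ubuntu13.3

        # Or, if that still conflicts, downgrade both to the precise/main version
        sudo apt-get install initramfs-tools=0.99ubuntu13 initramfs-tools-bin=0.99ubuntu13

        # Then finish any half-configured packages
        sudo dpkg --configure -a
        sudo apt-get install -f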

    Read the article

  • Re-deploy only the reports on SCOM Management Packs

    - by Gabriel Guimarães
    I've migrated Reporting Services on a SCOM 2007 R2 install, and noticed that the reports have not been copied. I can create a new report, but the ones I had from the management packs are gone. I've tried re-applying the Management Packs, however that doesn't re-deploy them, and when I try to access, for example: Monitoring - Microsoft Windows Print Server - Microsoft Windows Server 2000 and 2003 Print Services - State View - select any item and click Alerts on the right menu, I get the following error:
        Date: 12/24/2010 12:40:35 PM
        Application: System Center Operations Manager 2007 R2
        Application Version: 6.1.7221.0
        Severity: Error
        Message: Cannot initialize report.
        Microsoft.Reporting.WinForms.ReportServerException: The item '/Microsoft.SystemCenter.DataWarehouse.Report.Library/Microsoft.SystemCenter.DataWarehouse.Report.Alert' cannot be found. (rsItemNotFound)
           at Microsoft.Reporting.WinForms.ServerReport.GetExecutionInfo()
           at Microsoft.Reporting.WinForms.ServerReport.GetParameters()
           at Microsoft.EnterpriseManagement.Mom.Internal.UI.Reporting.Parameters.ReportParameterBlock.Initialize(ServerReport serverReport)
           at Microsoft.EnterpriseManagement.Mom.Internal.UI.Console.ReportForm.SetReportJob(Object sender, ConsoleJobEventArgs args)
    The report doesn't exist on the Reporting Services side. How do I re-deploy these reports? Thanks in advance.

    Read the article

  • Sitemaps showing twice in Webmaster Tools

    - by Andrew Lott
    Within my Webmaster Tools account I'm able to view details about sitemaps that are uploaded for websites I manage. For some reason each sitemap is listed twice with the exact same details under both "By me" and "All". Even sites that don't have a sitemap yet tell me I have 0 sitemaps, twice... I originally thought this might apply to sites that have multiple "Users & Site Owners" in Webmaster Tools, but it even happens for sites that only I manage. I've checked other user accounts to compare and this doesn't happen; they just get one set of tabs for By Me/All, not two. What could be causing this?

    Read the article

  • Google webmaster tools / Geographic location settings

    - by JochemTheSchoolKid
    I am building a website. It has a .nl domain. At the moment my domain only shows up on google.nl, and I hope I can change this somehow so that it can be found in all Google versions (like google.com / google.co.uk) and so on. If I look on the Google forums, they say to go to Webmaster Tools and change your geographic location there. But I have added this site and I am not able to change it there, because there is no select box. I don't have any idea where to search (yes, I searched on Google of course) or where to ask about this particular problem. So maybe someone here can point me in the right direction or explain what is possible and what is not. The question is: can I make a .nl domain findable in (almost) all Google search sites, and if so, how can I do that? Picture of my google webmaster tools (nl): http://i.stack.imgur.com/ZuP4L.png

    Read the article

  • EPM Architecture: Reporting and Analysis

    - by Marc Schumacher
    Reporting and Analysis is the basis for all Oracle EPM reporting components. Through the Java-based Reporting and Analysis web application deployed on WebLogic, it enables users to browse through reports for all kinds of Oracle EPM reporting components. Typical users access the web application by browser through Oracle HTTP Server (OHS). The Reporting and Analysis web application talks to the Reporting and Analysis Agent using the CORBA protocol on various ports. All communication to the repository databases (EPM System Registry and Reporting and Analysis database) from the web and application layers is done using JDBC. As an additional data store, the Reporting and Analysis Agent uses the file system to lay down individual reports. While the reporting artifacts are stored on the file system, the folder structure and report-based security information is stored in the relational database. The file system can be either local or remote (e.g. a network share or network file system). If an external user directory is used, the Reporting and Analysis services also communicate with this directory. The next post will cover WebAnalysis.

    Read the article

  • Why is awstats reporting my static IP instead of Domain Name?

    - by Austin
    In AWStats, under "Links from an external page (other web sites except search engines)", it has generated a list of pages that linked to my page. I see pages like Bing, YouTube, HotFrog, etc. However, there are also many internal links in the list. Towards the bottom it is reported as follows:
        http://72.249.150.9/distributors.php            5    2.4 %
        http://72.249.150.9/contact/                    5    2.4 %
        http://72.249.150.9/catalog/                    4    1.9 %
        http://72.249.150.9/flex-point-hockey-grip.php  5    2.4 %
        http://72.249.150.9/sticky-grip-foam.php        5    2.4 %
        http://72.249.150.9/video.php                   10   4.8 %
        http://72.249.150.9/dealers/                    5    2.4 %
        http://72.249.150.9/feedback/                   5    2.4 %
        http://72.249.150.9/products.php                10   4.8 %
        http://72.249.150.9/ergo-hockey-grip.php        5    2.4 %
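    A hedged sketch of the usual AWStats-side fix, assuming these hits really are self-referrals arriving via the server's bare IP: declare the IP (and the canonical host) in SiteDomain/HostAliases so AWStats treats them as internal rather than external referrers, and optionally 301-redirect IP-based requests at the web server. The domain name below is a placeholder, not taken from the question.
        # awstats.www.example.com.conf (sketch)
        SiteDomain="www.example.com"
        HostAliases="www.example.com example.com 72.249.150.9"

        # Apache (sketch): send requests that arrive by IP to the canonical host
        <VirtualHost *:80>
            ServerName 72.249.150.9
            Redirect permanent / http://www.example.com/
        </VirtualHost>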

    Read the article

  • Webmaster Tools word count

    - by Henrik Erlandsson
    Is there a way to somehow verify that Googlebot finds the headings and the content, for example by word count? I'm asking this because I tried a program called Screaming Frog, which fails to even fetch the first h1 on a validated page for about a third of all the pages(!), which has made me unsure of the results. Even though the site looks hunky dory in Webmaster Tools, I'd like to know what a googlebot-like content crawler finds on my page and in what order. Any tips on such tools are appreciated. This is not about keyword count.
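    Not an off-the-shelf tool, but a minimal sketch of the kind of check being described (Python standard library only, placeholder URL): fetch the page with a Googlebot-like User-Agent and report which h1 headings and roughly how many words come back in the raw HTML. Anything injected by JavaScript will not show up here, which is often where such discrepancies come from.
        # fetch_like_googlebot.py - rough sketch, not a real crawler
        import re
        import urllib.request
        from html.parser import HTMLParser

        class HeadingCollector(HTMLParser):
            def __init__(self):
                super().__init__()
                self.in_h1 = False
                self.headings = []
                self.text_chunks = []

            def handle_starttag(self, tag, attrs):
                if tag == "h1":
                    self.in_h1 = True
                    self.headings.append("")

            def handle_endtag(self, tag):
                if tag == "h1":
                    self.in_h1 = False

            def handle_data(self, data):
                self.text_chunks.append(data)
                if self.in_h1:
                    self.headings[-1] += data

        req = urllib.request.Request(
            "http://www.example.com/",  # placeholder URL
            headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
        )
        html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")

        parser = HeadingCollector()
        parser.feed(html)
        words = re.findall(r"\w+", " ".join(parser.text_chunks))
        print("h1 headings found:", [h.strip() for h in parser.headings])
        print("approximate word count (includes script/style text):", len(words))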

    Read the article

  • Webmaster Tools 500 crawl error for ASP faceted navigation that does not exist

    - by user19007
    I am getting 2,500 type 500 URL errors in Google Webmaster Tools. These pages are faceted navigation results that cannot be reached by a site visitor; these pages do not exist. We are using faceted navigation with the Volusion platform (ASP.NET, I think). I have specified URL parameters in Webmaster Tools so that Google will not try to index anything faceted, but this does not stop the errors from being generated. I am concerned about how this might affect SEO (bleeding page rank). I can provide additional information if needed. I am not sure how to solve this. I have started down the path of creating 301s, but I'm having some difficulty there as well.
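    As a sketch of another lever (my suggestion, not something from the question): if the faceted URLs share a recognizable query-string pattern, they can also be kept out of the crawl via robots.txt. The parameter name below is a placeholder, and blocking only stops future crawling; it does not clear errors that have already been reported.
        # robots.txt (sketch) - keep crawlers out of faceted results
        User-agent: *
        Disallow: /*?facet=
        Disallow: /*&facet=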

    Read the article

  • What tools exist to report bugs

    - by Luis Alvarado
    Until today I only knew about ubuntu-bug, which I could use to report bugs about a specific program. But now I have learned about apport-collect and apport-bug, which basically are:
    apport-bug - Reports problems to Launchpad, using Apport to collect a lot of information about your system to help the developers fix the problem and avoid unnecessary questions and answers.
    apport-collect - Works like apport-bug but sends the information to an already existing bug report. In my case it was apport-collect 1060268 (thanks to Brad Figg on Launchpad).
    Are there any other tools to report bugs?
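    A short usage sketch of the commands mentioned above (the package name is just an example; the bug number is the one from the question):
        # Report a new bug against a specific package
        ubuntu-bug firefox
        apport-bug firefox

        # Attach the collected system information to an existing Launchpad bug
        apport-collect 1060268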

    Read the article

  • Most underestimated programming tool [closed]

    - by Anto
    We have many great tools which help a lot when programming, such as good programmers' text editors, IDEs, debuggers, version control systems, etc. Some of these tools are more or less "must have" tools for getting the job done (e.g. compilers). There are also tools which help a lot but don't get much attention, for various reasons - for instance, they were ahead of their time when they were released and are now more or less forgotten. What type of programming tool do you think is the most underestimated one? Motivate your answer.

    Read the article

  • Best way to explain to someone that software developers need to install tools (mainly build integrat

    - by leeand00
    I work at a software company where most of the people are afraid to install new tools to increase productivity. They give me excuses like: "I don't need to install something else", "I can do this myself", etc. ... many other baseless arguments. In an ecommerce business, the end-users should not have to install anything; everything should be managed by them from the web, and the developers should be the ones installing things to increase productivity and teamwork, e.g.:
    Version control systems
    Build tools (Ant, NAnt, Maven, continuous integration, CSS frameworks)
    Integrated development environments
    Frameworks (unit testing, etc.)
    Etc...
    How else can I get my point across without sounding crass?

    Read the article

  • Enable anonymous access to report builder in reporting services 2008

    - by ilivewithian
    I have a Reporting Services 2008 server installed on Windows Server 2003. I am trying to allow anonymous access to the Report Builder folder so that my users do not have to select the "remember password" option when they log in, if they want to use Report Builder. All I have found so far is that I should be able to do this with IIS Manager, but that only seems to work for Reporting Services 2005. Reporting Services 2008 does not show up in IIS Manager; enabling anonymous access seems to be hidden somewhere else. How do I enable anonymous access to Report Builder in Reporting Services 2008?
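    For context, a hedged pointer (not from the question): SSRS 2008 hosts its own HTTP endpoints instead of running inside IIS, so authentication is configured in the <Authentication> section of rsreportserver.config rather than in IIS Manager. A sketch of that section is below; a genuinely anonymous Report Builder folder would still require a custom security extension, so treat this as a starting point only.
        <!-- rsreportserver.config (sketch) -->
        <Authentication>
          <AuthenticationTypes>
            <RSWindowsNegotiate/>
            <RSWindowsNTLM/>
            <!-- <Custom/> would go here if a custom (e.g. forms/anonymous)
                 security extension were installed -->
          </AuthenticationTypes>
        </Authentication>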

    Read the article

  • Reporting Services Error 401: Unauthorized

    - by JJ
    I am trying to set up Reporting Services on a SQL Server 2008 box and get this error message when I try to connect to http://localhost/reports: "The request failed with HTTP status 401: Unauthorized." I asked the server administrator and IIS should be running (I can't find it under Administrative Tools, though). I used the Reporting Services Configuration Manager to set up Reporting Services and am pretty sure it is set up correctly.

    Read the article

  • IE 11 Developer Tools - changing console target to a different frameset or iframe

    - by vladimirl
    Originally posted on: http://geekswithblogs.net/vladimirl/archive/2013/10/25/ie-11-developer-tools---changing-console-target-to-a.aspx
    To change the current console iframe/frameset, type this into the console command line, where "contentIFrame" is the iframe/frameset name (there should not be quotes around the iframe name when you type it):
        console.cd(contentIFrame);
    To return to the top-level window, use cd() with no argument:
        console.cd();
    It took me some time to find out that this was possible in the IE 11 developer tools. Everything is so much easier in Chrome. No drama. Sometimes I feel that I hate IE more and more.
    Reference (http://msdn.microsoft.com/en-us/library/ie/dn255006(v=vs.85).aspx#console_in): All script entered in the command line executes in the global scope of the currently selected window. If your webpage is built with a frameset or iframes, those frames load their own documents in their own windows. To target the window of a frameset frame or an iframe, use the cd() command, with the frame/iframe's name or ID attribute as the argument. For example, you have a frame with the name microsoftFrame and you're loading the Microsoft homepage in it.
        cd(microsoftFrame);
        Current window: www.microsoft.com/en-us/default.aspx
    Important: Note that there were no quotes around the name of the frame. Only pass the unquoted name or ID value as the parameter. To return to the top-level window, use cd() with no argument.

    Read the article

  • Weird entry for robots.txt on a Naked Domain in Google Webmaster Tools

    - by Metalshark
    We own a .co.uk address and use an Internet hosting company that has made mistakes around DNS in the past. Our main site is hosted on www., and their reluctance to allow editing of AAAA records online means our naked domain does not resolve. Currently, when we attempt to reach the naked version there is no entry for the browser to go to and it displays an unreachable page (nslookup just says "Name: <name of domain>" with no further entries such as an IP or canonical name). We recently added the relevant TXT records to verify us to view both the www. version and the naked version of the domain in Google Webmaster Tools (in anticipation of the requests to our Internet host coming to fruition). Imagine our shock when double-checking Site configuration / Crawler access and finding an (admittedly failing) robots.txt with a dynamically generated HTML page (full of crude pop-up JavaScript) with references to 3 of our most prominent competitors. What could cause this to happen? As we are in the UK, I am assuming some DNS server is serving Google bad information. We are going to contact the Internet hosting company to fix our A and AAAA records once and for all, then check that they work in the US (using something like OpenDNS). Should we be doing more though, for instance informing Google (through Webmaster Tools) that we are now aware there is something currently wrong with our naked domain? UPDATE: We have fixed our A records (not AAAA) and that has resolved the issue. But if there are further actions we should take for effectively having a parking page hosted on our active, visitor-heavy, SEO-rich domain that advertised our competitors to US visitors, what would they be?
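    For reference, a sketch (in BIND zone-file style, with placeholder names and addresses) of the record set being requested from the host so that both the naked domain and www resolve; the AAAA line only applies if the server actually has an IPv6 address:
        ; example.co.uk zone - placeholder values, not the real addresses
        example.co.uk.      IN  A     192.0.2.10
        www.example.co.uk.  IN  A     192.0.2.10
        ; optional, only if an IPv6 address exists:
        example.co.uk.      IN  AAAA  2001:db8::10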

    Read the article

  • Google Webmasters Tools strange 404 errors referred from same site

    - by Out of Control
    Starting about a month ago, I noticed a sudden increase in 404 errors in Webmaster Tools for one of my sites (over 1,400 errors so far). All the errors are being referred from my own site to non-existent pages. The 404 error URLs are all of the same format:
        URL: http://www.helloneighbour.com/save/1347208508000
    The number on the end appears to be a timestamp followed by 3 zeros. The referring page, in this case, is:
        Linked from http://www.helloneighbour.com/save/cmw-insurance-insurance-burnaby
    When I look at the source code of that page, or I use Webmaster Tools to view the page as Google sees it, I can't find any link that comes close to what is above. I built the site, and I can't find any place that might be causing these false links either. The server logs (access and error) don't show Google or anyone else trying to access these links. I've marked all these pages as fixed, and waited a couple of weeks, only to find the errors come back again over the last few days. I'm wondering if anyone else has seen anything strange like this, or if someone might have a way for me to debug or replicate this error myself.
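    One small aside of my own, not from the question: the trailing number parses cleanly as a JavaScript-style millisecond timestamp, which would be consistent with a script somewhere building URLs from Date.now() or similar. A quick conversion in Python:
        from datetime import datetime, timezone

        # 1347208508000 is the number from the example URL above
        print(datetime.fromtimestamp(1347208508000 / 1000, tz=timezone.utc))
        # -> 2012-09-09 16:35:08+00:00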

    Read the article

  • Webmaster tools, Duplicate Meta Descriptions, and Short Meta Descriptions [closed]

    - by Watsy91
    Possible Duplicate: Do meta keywords have any impact on ranking algorithms?
    I am fairly new to the whole Webmaster Tools concept. I have been looking at all the different options, such as crawl errors, HTML improvements, etc. I have been looking at the Duplicate Meta Descriptions and Short Meta Descriptions sections and was wondering if anyone could suggest ideas on how to go about improving this. It seems that all the duplicates come from the URL title and the short description. It would seem to me that most people would have information regarding the page with the same keywords as their titles. Here's an example of one:
        These are the ultimate hampers in taste, quality and value. Amongst this range of luxury hampers ar
        /food-hampers/food-hampers-over-100.html
        /thank-you-gifts/large-gifts-over-100.html
    To get to the point, I just want to know: do these things really matter? Would they have a real consequence on my sites' rankings? My sites have been falling down the rankings since early this year and I have really started to look at Google Analytics and Webmaster Tools to try to identify certain problems. I have researched the Internet and it seems that some people don't bother and others do!! I know that Stack Overflow has hundreds of people who have gone through the above and I would really appreciate it if they could give me some tips, etc. Or in the END does it really matter?? :D
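    A small illustration of the fix the report is nudging towards (the wording below is invented, not taken from the site): give each of the two pages that currently share one description its own meta description.
        <!-- /food-hampers/food-hampers-over-100.html -->
        <meta name="description" content="Luxury food hampers over £100, hand-packed and delivered next day.">

        <!-- /thank-you-gifts/large-gifts-over-100.html -->
        <meta name="description" content="Large thank-you gift hampers over £100 for clients, friends and family.">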

    Read the article

  • What GUI tools are available for which DVCS?

    - by Macneil
    When I worked at Sun, we used a DVC system called Forte SCCS/Teamware, which used the old SCCS file format but was a true distributed source code revision control system. One nice feature was its strong GUI support: you could bringover and putback changes by simply clicking and dragging. It would draw trees/graphs showing how workspaces relate to each other. You could also have a graph view displaying a single file's complete history, which might have had several branches and merges, allowing you to compare any two points. It also had a strong visual merge tool to let you accept changes from one of two conflicting files. Naturally, many of the current DVCSs have command-line support for these operations, but I'm looking for GUI support in order to use this in a lower-level undergraduate course I'll be teaching. I'm not saying the Forte Teamware solution was perfect, but it did seem to be ahead of the curve. Unfortunately, it's not a viable option to use for my class. Question: What support do the current DVCSs have with regards to GUIs? Do any of them work on Windows, and not just Linux? Are they "ready for prime time" or still works in progress? Are these standalone or built as plug-ins, e.g., for Eclipse? Note: To help keep this discussion focused, I'm only interested in GUI tools, and not in a meta-discussion about whether GUI tools should be used in teaching.

    Read the article

  • Fake links cause crawl error in Google Webmaster Tools

    - by Itai
    Google reported Crawl Errors last week on my largest site through Webmaster Tools. Here is the message: "Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site's pages." The Crawl Errors list is now full of hundreds of fake links like these, causing 16,519 errors so far. Note that my site does not even have a search.html and is not related to any of the terms shown in the above image. Inspecting sources for one of those links, I can see this is not simply an isolated source but a concerted effort: each of the links has a few to a dozen sources, all from different, seemingly unrelated sites. It is completely baffling as to why someone would spend effort doing this. What are they hoping to achieve? Is this an attack? Most importantly: Does this have a negative effect on my site? Could it negatively impact my ranking? If so, what to do about it? The few linking pages I looked at are full of thousands of links to tons of sites, have no contact information, and do not seem like the kind of people who would simply stop if asked nicely! According to Google Webmaster Tools, these errors have appeared in a span of 11 days. No crawl errors were being reported previously.

    Read the article

  • Webmaster Tools is throwing out 404 errors for a link not on the page

    - by plantify
    Webmaster Tools is showing thousands of 404 errors, where pages on the site are reported as referring to an incorrect URL. For example: URL not found: www.plantify.co.uk/shop/=, linked from http://www.plantify.co.uk/shop/gift-voucher and http://www.plantify.co.uk/shop/special-plant-offers. I have obviously checked the source and cannot find any references to this link on any page. The only consistent pattern is that it only seems to report this error on pages with two path sections, i.e. www.plantify.co.uk/shop does not report any error, whilst all pages of the form www.plantify.co.uk/shop/xxx (where xxx can be several different pages, such as gift-voucher) report this. I cannot seem to duplicate this error. I have run a link checker (we use Screaming Frog) and it does not report this error. I have fetched these pages as a bot, and these do not report this error. I am at a total loss. I cannot even duplicate the issue, but it is most definitely an issue, as Webmaster Tools is reporting new errors every day. Is this perhaps Googlebot doing its own thing?
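    One extra check worth trying (my suggestion, not from the question): grep the deployed templates and source for anything emitting a bare "=" as a link target, since a relative href of "=" on a page like /shop/gift-voucher resolves to /shop/=. The patterns below are guesses:
        # run from the web root / template directory
        grep -rn 'href="="' .
        grep -rn "href='='" .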

    Read the article

  • SQL Server 2012 : The Data Tools installer is now available

    - by AaronBertrand
    Last week when RC0 was released, the updated installer for "Juneau" (SQL Server Data Tools) was not available. Depending on how you tried to get it, you either ended up on a blank search page, or a page offering the CTP3 bits. Important note: the CTP3 Juneau bits are not compatible with SQL Server 2012 RC0. If you already have Visual Studio 2010 installed (meaning Standard/Pro/Premium/Ultimate), you will need to install Service Pack 1 before continuing. You can get to the installer simply by opening...(read more)

    Read the article

  • Tools to simulate mobile devices on a desktop to test websites

    - by Kris
    Are there any good tools that can be run on desktop machines (Windows or Linux) that can simulate a mobile device, preferably with some options as to screen size and mobile browser (user agent if not full render engine). I know it is never going to be perfect (especially without an actual touchscreen), but having a tool on our development machines to do what testing we can that way would be very useful.

    Read the article

  • What non-programming tools do programmers use?

    - by user828584
    I'm reading Code Complete with the intention of learning how to better structure my code, but I'm also learning a lot about how many aspects of programming there are beyond just writing the code. The book talks a lot about problem definition, determining the requirements, defining the structure, designing the code, etc. What tools are used for these non-writing steps of programming? Is there software that will help me design and plan out what I'm going to write before I do?

    Read the article

  • Webmaster Tools: root and subdirectories?

    - by nick
    We have all our international sites on our .com domain, like this: site.com/uk, site.com/us, etc. When creating the sites in Webmaster Tools I've created different sites and submitted sitemaps for each directory, so that we can appropriately geotarget each site. Is it also recommended to add the root .com with its geotargeting set to international? If so, should I also add all the separate sitemaps (like /us/sitemap.xml) even though they have been added to the directory-level sites?
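    If the root .com property is added, one low-friction pattern (an illustration on my part, not something from the question) is to submit a single sitemap index at the root that simply points at the existing per-directory sitemaps:
        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>http://site.com/uk/sitemap.xml</loc></sitemap>
          <sitemap><loc>http://site.com/us/sitemap.xml</loc></sitemap>
        </sitemapindex>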

    Read the article
