Search Results

Search found 2906 results on 117 pages for 'reporting'.

Page 52/117 | < Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >

  • The Best Data Integration for Exadata Comes from Oracle

    - by maria costanzo
    Oracle Data Integrator and Oracle GoldenGate offer unique and optimized data integration solutions for Oracle Exadata. For example, customers who choose to feed their data warehouse or reporting database with near real-time data throughout the day can do so without decreasing the performance or availability of the source and target systems. And if you ask why real-time, the short answer is: in today's fast-paced, always-on world, business decisions need to use more relevant, timely data to be able to act fast and seize opportunities. A longer response to the "why real-time" question can be found in a related blog post. If we look at the solution architecture, as shown in the diagram below, Oracle Data Integrator and Oracle GoldenGate are both uniquely designed to take full advantage of the power of the database and to eliminate unnecessary middle-tier components.

    Oracle Data Integrator (ODI) is the best bulk data loading solution for Exadata. ODI is the only ETL platform that can leverage the full power of Exadata, integrate directly on the Exadata machine without any additional hardware, and by far provides the simplest setup and fastest overall performance on an Exadata system. We regularly see customers achieve a 5-10x boost when they move their ETL to ODI on Exadata. For some companies the performance gain is even higher. For example, a large insurance company ran a proof of concept comparing ODI against a traditional ETL tool (one of the market leaders) on Exadata. The same process that took 5 hours and 11 minutes to complete using the competing ETL product took 7 minutes and 20 seconds with ODI. Oracle Data Integrator was 42 times faster than the conventional ETL tool when running on Exadata. This shows that Oracle's own data integration offering helps you gain the most from your Exadata investment with a truly optimized solution.

    GoldenGate is the best solution for streaming data from heterogeneous sources into Exadata in real time. Oracle GoldenGate can also be used together with Data Integrator for hybrid use cases that also demand non-invasive capture and high-speed real-time replication. Oracle GoldenGate captures real-time data feeds from heterogeneous sources non-invasively and delivers them to the staging area on the target Exadata system. ODI runs directly on Exadata to use the database engine's power to perform in-database transformations. Enterprise Data Quality is integrated with Oracle Data Integrator and enables ODI to load trusted data into the data warehouse tables. Only Oracle can offer all these technical benefits wrapped into a single data warehouse solution that runs on Exadata. Compared to traditional ETL with add-on CDC, this solution offers:

    - Non-invasive data capture from heterogeneous sources, avoiding any performance impact on the source
    - No mid-tier; set-based transformations use database power
    - Mini-batches throughout the day or bulk processing nightly, which means maximum availability for the DW
    - An integrated solution with Enterprise Data Quality that enables leveraging trusted data in the data warehouse

    In addition to Starwood Hotels and Resorts, Morrison Supermarkets, the United Kingdom's fourth-largest food retailer, has seen the power of this solution for its new BI platform and shared its story with us. Morrisons needed to analyze data across a large number of manufacturing, warehousing, retail, and financial applications with the goal of achieving a single view into operations for improved customer service. The retailer deployed Oracle GoldenGate and Oracle Data Integrator to bring new data into Oracle Exadata in near real time and replicate the data into reporting structures within the data warehouse, extending visibility into operations. Using Oracle's data integration offering for Exadata, Morrisons produced financial reports in seconds rather than minutes, and improved staff productivity and agility. You can read more about Morrisons' success story here and hear from Starwood here. From an Irem Radzik article.

    Read the article

  • Columnstore Case Study #1: MSIT SONAR Aggregations

    - by aspiringgeek
    Preamble: This is the first in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014. Many of these can be found in this deck along with details such as internals, best practices, caveats, etc. The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative.

    Why Columnstore? If we're looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we're asking a question which by design needs to hit lots of rows (DW, reporting, aggregations, grouping, scans, etc.), SQL Server has never had a good mechanism until columnstore. Columnstore indexes were introduced in SQL Server 2012; however, they're still largely unknown. Some adoption blockers existed, yet columnstore was nonetheless a game changer for many apps. In SQL Server 2014, potential blockers have been largely removed & they're going to profoundly change the way we interact with our data. The purpose of this series is to share the performance benefits of columnstore & to document why columnstore is a compelling reason to upgrade to SQL Server 2014.

    App: MSIT SONAR Aggregations. At MSIT, performance & configuration data is captured by SCOM. We archive much of the data in a partitioned data warehouse table in SQL Server 2012 for reporting via an application called SONAR. By definition, this is a primary use case for columnstore: report queries requiring aggregation over large numbers of rows. New data is refreshed each night by an automated table partitioning mechanism, a best-practices scenario for columnstore.

    The Win: Compared to classic indexing, which resulted in the expected query plan selection including partition elimination, the SQL Server 2012 nonclustered columnstore increased query performance significantly. Logical reads were reduced by over a factor of 50; both CPU & duration improved by factors of 20 or more. Other than creating the columnstore index, no special modifications or tweaks to the app or database schema were necessary to achieve the performance improvements. Existing nonclustered indexes were rendered superfluous & were deleted, thus mitigating maintenance challenges such as defragging as well as conserving disk capacity.

    Details: The table provides the raw data & summarizes the performance deltas.

                                        Logical Reads (8K pages)   CPU (ms)    Durn (ms)
        Columnstore                     160,323                    20,360      9,786
        Conventional Table & Indexes    9,053,423                  549,608     193,903
        Δ                               x56                        x27         x20

    The charts provide additional perspective on this data. "Conventional vs. Columnstore Metrics" documents the raw data; note on this linear display the magnitude of the conventional index performance vs. columnstore. The "Metrics (Δ)" chart expresses these values as a ratio.

    Summary: For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing. I have documented here, in the first of a series of reports on columnstore implementations, results from an initial implementation at MSIT in which logical reads were reduced by over a factor of 50; both CPU & duration improved by factors of 20 or more. Subsequent posts in this series document performance enhancements that are even more significant.
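    For a rough sense of what "creating the columnstore index" amounts to, a minimal T-SQL sketch follows (the table and column names are hypothetical, since the post does not show the actual SONAR schema):

        -- Hypothetical schema; SQL Server 2012 nonclustered columnstore syntax.
        CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_PerfData
        ON dbo.PerfData (CounterId, SampleTime, SampleValue);

        -- The kind of report query that benefits: a large scan plus aggregation.
        SELECT CounterId, AVG(SampleValue) AS AvgValue
        FROM dbo.PerfData
        WHERE SampleTime >= '20120101'
        GROUP BY CounterId;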

    Read the article

  • How to deal with overly aggressive "Link Take Down Demands"?

    - by Eoin
    I've been receiving a large number of emails recently requesting that I clean up link spam on my forum. Initially the emails were very polite and professional, and I was happy to remove the links. Recently the emails have gotten very abrasive; here is a particularly rude example:

        From: [email protected]
        To: [email protected]
        Hi,
        This is the second time we are reaching out to you regarding your link to our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id. We really do need to remove this link. We have to report to Google any link we were unable to remove, and I wouldn't want to have to include your site in the list. Could you please remove our link from this page and any other page on your site?
        Thank You, Name Changed

    Behind the superficial pleasantries I feel there is some very real maliciousness. Note the email address: DMCA Violations. I don't see how the DMCA is involved here, except as a term which tends to strike fear into many people. Also relating to the email address, it doesn't match the company being linked to at all. How am I to trust they are truly operating on behalf of company-two when they don't even use one of its email addresses? My email is hidden by privacypost. While a service with legitimate uses, I feel it's highly unprofessional for communications between two companies. The claim "This is the second time...": every email I've received has started like this, but a check of my spam filters has never revealed a first mail. Initially I gave them the benefit of the doubt; by now, though, it's clear this is a cheap ploy to start me off on the defensive. And finally, worst of all: the threats of reporting me to Google if I don't do everything they ask. I sent a polite reply asking for more information. I have no idea if the email address was even valid, but I never received any response. Much later I got this followup mail:

        From: [email protected]
        To: [email protected]
        Hi,
        This is the final time we are reaching out to you regarding your link to our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id. We will soon be reporting to Google any link we were unable to remove, and currently your site will have to be on the list. Could you please remove our link from this page and any other page on your site? I appreciate your urgent attention to this matter.
        Thank You, Name Changed

    This time the from address was more personal, though still not obviously connected to the spammed company. Let's be honest: I don't for one second believe that the companies were the victims of a 3rd-party spammer as they claim. The links in question were generated well over a year ago, and I firmly believe the companies were directly responsible for the spam links in question, a type of spam that has plagued my forum. Now they have the audacity to demand I spend my time cleaning up their mess, using threats to ensure they get their way. Have recent changes in Google's algorithms meant all the cash they spent spamming the web has now turned into a liability? If so, I can see why these companies are all of a sudden running scared. Frankly, cleaning up my forum is a good thing, but the threats they are using sicken me. So my question here is specifically about the threats: Are they valid, and would such reports to Google destroy my page rankings? Is there a way I can report this abusive behaviour to Google?

    Read the article

  • Oracle Executive Strategy Brief: Enterprise-Grade Cloud Applications

    - by B Shashikumar
    Cloud computing has clearly evolved into one of the dominant secular trends in the industry. Organizations are looking to the cloud to change how they buy and consume IT. And it's no longer just about lower up-front costs. The cloud promises to deliver greater agility and free up resources to focus on innovation versus running and maintaining systems. But are organizations actually realizing these benefits? The full promise of cloud is not being realized by customers who entrust their business to multiple niche cloud providers. While almost 9 out of 10 companies expect more IT agility with cloud, only 47% are actually getting it (Source: 2011 State of Cloud Survey by Symantec). These niche cloud customers have also seen the promises of lower costs, efficiency gains, improved security, and compliance go unfulfilled. Having one cloud provider for customer relationship management (CRM) and another for human capital management (HCM), and then trying to glue these proprietary systems together while integrating with a back-office financial system, can add to complexity and long-term costs. Completing a business process or generating an integrated report is cumbersome and leverages incomplete data. Why can't niche cloud providers deliver on the full promise of cloud? It's simple: you still need to complete business processes. You still need reporting that enables you to take action using data from multiple systems. You still have to comply with SOX and other industry regulations. These requirements don't go away just because you deploy in the cloud. Delivering lower up-front costs by enabling customers to buy software as a service (SaaS) is the easy part. To get real value that lasts longer than your quarterly report, it's important to realize the benefits of cloud without compromising on functionality, and while having the right level of control and flexibility. This is the true promise of cloud. Oracle's cloud strategy centers on delivering the benefits of cloud without compromise. We uniquely empower our customers with complete solutions and choice, from the richest functionality to integrated reporting and a great user experience. It's all available in the cloud. And it works not just with other Oracle cloud applications, but with your existing Oracle and third-party systems as well. This helps protect your current investments and extend their value as you journey to the cloud. We've made the necessary investments not only in our applications but also in the underlying technology that makes it all run, from the platform down to the hardware and operating system. We make it all. And we've engineered it to work together and be highly optimized for our customers, in the cloud. With Oracle enterprise-grade cloud applications, you get the benefits of cloud plus more power, more choice, and more confidence. Read more about how you can realize the true advantage of cloud with Oracle enterprise-grade cloud applications in the Oracle Executive Strategy Brief here. You can also attend an Oracle Cloud Conference event at a city near you. Register here.

    Read the article

  • WinForms ReportViewer: slow initial rendering

    - by Bryan Roth
    UPDATE 2.4.2010: Yeah, this is an old question but I thought I would give an update. So, I'm working with the ReportViewer again and it's still rendering slowly on the initial load. The only difference is that the SQL database is on the reporting server.

    UPDATE 3.16.2009: I have done profiling and it's not the SQL that is making the ReportViewer render slowly on the first call. On the first call, the ReportViewer control locks up the UI thread and makes the program unresponsive. After about 5 seconds the ReportViewer will unlock the UI thread and display "Report is being generated" and then finally show the report. I know 5 seconds is not much but this shouldn't be happening. My coworker does the same thing in a program of his and the ReportViewer immediately displays the "Report is being generated" upon any request. The only difference is that the reporting server is on one server and the data is on another server. However, when I am developing the reports within SSRS, there is no delay.

    UPDATE: I have noticed that only the first load of the ReportViewer takes a long time; each subsequent load of the same or different reports loads fast.

    I have a WinForms ReportViewer that I'm using in Remote processing mode that can take up to 30 seconds to render when the ReportViewer.RefreshReport() method is called. However, the report itself runs fast. This is the code to setup my ReportViewer:

        rvReport.ProcessingMode = ProcessingMode.Remote
        rvReport.ShowParameterPrompts = False
        rvReport.ServerReport.ReportServerUrl = New Uri(_reportServerURL)
        rvReport.ServerReport.ReportPath = _reportPath

    This is where the ReportViewer can take up to 30 seconds to render:

        rvReport.RefreshReport()

    Read the article

  • RotatingFileHandler throws an exception when delay parameter is set

    - by Eli Courtwright
    When I run the following code under Python 2.6:

        import logging
        from logging.handlers import RotatingFileHandler

        rfh = RotatingFileHandler("testing.log", delay=True)
        logging.getLogger().addHandler(rfh)
        logging.warning("Boo!")

    then the last line throws AttributeError: RotatingFileHandler instance has no attribute 'level'. So I add the line rfh.setLevel(logging.DEBUG) before the call to addHandler, and then the last line throws AttributeError: RotatingFileHandler instance has no attribute 'filters'. So if I manually set filters to be an empty list, then it complains about not having the attribute lock, etc. When I remove the delay=True to leave it as the default value of False as documented here, the problem completely goes away. Am I missing something? How do I properly use the delay parameter of the RotatingFileHandler class?

    EDIT: Upon further analysis (presented in my own answer below), this looks like a bug, but I can't find a bug report on this in the Python bug tracker, even trying different search terms, so I guess I'll report it. However, if someone can locate the actual bug report, then I can avoid submitting a duplicate report and wasting the time of the Python developers. I'll hold off on reporting the bug for a few hours, and if someone posts an answer that has the current bug report, then I'll accept that answer for this question.
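    The traceback pattern (missing level, then filters, then lock) suggests the base Handler initializer never ran on the delayed code path. A minimal workaround sketch under that assumption (not a confirmed fix) is to run the base initializer by hand after construction:

        import logging
        from logging.handlers import RotatingFileHandler

        rfh = RotatingFileHandler("testing.log", delay=True)
        # Assumption: re-running the base initializer supplies the attributes
        # (level, filters, lock) that the delay=True path appears to skip.
        logging.Handler.__init__(rfh)
        logging.getLogger().addHandler(rfh)
        logging.warning("Boo!")  # no AttributeError once the base init has run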

    Read the article

  • Help/Questions About New Team Foundation Server 2010 Installation

    - by user579218
    Hello. Before starting down the TFS2010 installation process, I have a few questions I'm hoping the community can help me with. We're planning on a single-server installation of TFS2010. Initially, we want version/source control and build services, but not reporting or SharePoint. We may add reporting and SharePoint capabilities later. Our environment will be Windows Server 2008 R2 (x64), SQL Server 2008 R2 (x64), Office 2010 (x86), Visual Studio 6 and 2010, and, of course, Team Foundation Server 2010.

    - Can I install TFS2010 on a server that is on our domain? It's not a domain controller, it's just a member server on the domain. Should I install TFS2010 before or after putting the server on the domain?
    - We have six developers that will be logging into their local development computers (which are also on the same domain) using their domain user accounts. Do I add each domain user to the TFS2010 server's security groups? If so, which one(s)?
    - Can I or should I use a domain user account as the TFS2010 service account? Or should I just use Network Service? The TFS2010 install guide notes that none of the service accounts should belong to the Administrators security group, so which security group(s) are recommended for the service account(s)?
    - We're planning on using a local instance of SQL Server 2008 R2 Standard with TFS2010; what service account should we use? Should we use the same domain account as TFS2010, or Local System, or something else? The TFS2010 install guide isn't very specific on this.
    - Since we're planning on this server being both the version/source control and build server, should we install our development environments (VS6, VS2010, Access2010) before installing TFS2010? Or does it matter?

    Thanks in advance for answering these questions.

    Read the article

  • PHP Mystery Theatre - HTML Element doesn't display when served with php5, restart server with php4 and Element displays.

    - by togglemedia
    Get this... I start the server in php5 and a specific HTML element (an 'a' element with a background image) is nowhere to be seen; I reboot the server in php4 and the HTML element is displaying properly. I boot back and forth between php5 and php4 with absolutely consistent results: not displaying in php5 and displaying in php4. The thing that blows my mind is that:

    - the HTML is consistent between boots of php5/4, so both scenarios have the necessary HTML elements
    - the CSS is consistent between boots of php5/4, so both scenarios have the necessary CSS definitions
    - there is an identically styled sibling element, with a different class name, that displays properly
    - the problem is reproducible between browsers and between platforms. I've tested it on every possible config: Mac/Windows, IE6, 7, 8, Firefox, Safari, etc.

    When the server is booted in php5, there is one HTML element that just doesn't display/render. A stab in the dark: I turned on PHP error reporting in php5 (to see if there would be any clues there) and, lo and behold, the HTML element is now rendering in php5. I turn off PHP error reporting, restart php5, and the HTML element is still rendering in php5; the problem has been fixed... and I can't get the problem to reproduce. This is why my curiosity has brought me here. I just spent about four hours scratching my head trying to figure out how this could be. And now, I ask any PHP web dev gurus out there... what the heck was this all about?

    Read the article

  • extracting RDL data using LINQ

    - by BobC
    I'm working with some SQL Report definition files (RDLs), using LINQ to extract component query statements for validation. I'm trying to extract the <DataSet> elements from under the <DataSets> element. I seem to be getting hung up on one of the elements under <DataSet><Fields><Field>, which has a namespace qualifier: <rd:TypeName>. I've been using LINQ to XML for other parts of the files where there are no namespace qualifiers with no trouble, by specifying a default namespace. The RDL specifies two namespaces:

        xmlns="http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition"
        xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner"

    When I try to get the <DataSets> element, however, I get the following error: System.Xml.XmlException - The ':' character, hexadecimal value 0x3A, cannot be included in a name. I know it has to do with the namespace qualifier (rd:) in one of the child elements, but I'm having difficulty getting a LINQ expression that works. Any help would be appreciated. Thanks!
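    That exception is typically raised when a prefixed name like "rd:TypeName" is passed as a plain string where an XName is expected. A sketch of the usual LINQ to XML approach (illustrative only; the file name and query shape here are made up, not taken from the question): build an XNamespace for each declared namespace and combine it with the local name.

        // Requires: using System.Linq; using System.Xml.Linq;
        XNamespace d  = "http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition";
        XNamespace rd = "http://schemas.microsoft.com/SQLServer/reporting/reportdesigner";

        XDocument rdl = XDocument.Load("MyReport.rdl");  // hypothetical file name
        var fields =
            from ds in rdl.Descendants(d + "DataSet")
            from f in ds.Descendants(d + "Field")
            select new
            {
                DataSet  = (string)ds.Attribute("Name"),
                Field    = (string)f.Attribute("Name"),
                TypeName = (string)f.Element(rd + "TypeName")  // prefixed element, no ':' in the name string
            };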

    Read the article

  • Sanity check: UIBarButtonItem crashes trying to perform action

    - by Giao
    One of my users is reporting a crash on his device, an iPhone 3GS. Other devices of the same type are not reporting similar behavior. He's sent me a crash log, and based on reading it I'm not sure how to proceed. I hope I'm not interpreting the crash log incorrectly, but it doesn't look like my action has been called yet. This is how I'm creating and setting up the UIBarButtonItem:

        - (void)viewDidLoad {
            [super viewDidLoad];
            UIBarButtonItem *addButton = [[UIBarButtonItem alloc]
                initWithBarButtonSystemItem:UIBarButtonSystemItemAdd
                                     target:self
                                     action:@selector(addLog:)];
            self.navigationItem.rightBarButtonItem = addButton;
            [addButton release];
        }

    This is my action method:

        - (IBAction)addLog:(id)sender {
            MyViewController *myController = [[MyViewController alloc] initWithNibName:@"MyNib" bundle:nil];
            UINavigationController *subNavigationController =
                [[UINavigationController alloc] initWithRootViewController:myController];
            [self presentModalViewController:subNavigationController animated:YES];
            [myController release];
            [subNavigationController release];
        }

    This is the crash log:

        Exception Type:  EXC_CRASH (SIGABRT)
        Exception Codes: 0x00000000, 0x00000000
        Crashed Thread:  0

        Thread 0 Crashed:
        0   libSystem.B.dylib  0x0007e98c __kill + 8
        1   libSystem.B.dylib  0x0007e97c kill + 4
        2   libSystem.B.dylib  0x0007e96e raise + 10
        3   libSystem.B.dylib  0x0009361a abort + 34
        4   MyApp              0x000042e8 0x1000 + 13032
        5   CoreFoundation     0x00058ede -[NSObject performSelector:withObject:withObject:] + 18
        6   UIKit              0x0004205e -[UIApplication sendAction:to:from:forEvent:] + 78
        7   UIKit              0x00094d4e -[UIBarButtonItem(Internal) _sendAction:withEvent:] + 86
        8   CoreFoundation     0x00058ede -[NSObject performSelector:withObject:withObject:] + 18
        9   UIKit              0x0004205e -[UIApplication sendAction:to:from:forEvent:] + 78
        10  UIKit              0x00041ffe -[UIApplication sendAction:toTarget:fromSender:forEvent:] + 26
        11  UIKit              0x00041fd0 -[UIControl sendAction:to:forEvent:] + 32
        12  UIKit              0x00041d2a -[UIControl(Internal) _sendActionsForEvents:withEvent:] + 350
        13  UIKit              0x0004263e -[UIControl touchesEnded:withEvent:] + 330
        14  UIKit              0x00041656 -[UIWindow _sendTouchesForEvent:] + 318
        15  UIKit              0x00041032 -[UIWindow sendEvent:] + 74

    Read the article

  • Problem with DOMPDF and Kohana 3

    - by alex
    I have used DOMPDF many times before successfully, however outside of the Kohana Framework. I created a module for DOMPDF and called it simply pdf. Here is its code:

        class Pdf {

            private $domPdfInstance;

            public function __construct($html)
            {
                if ( ! class_exists('DOMPDF', FALSE)) {
                    // Load DOMPDF
                    require Kohana::find_file('vendor', 'dompdf/dompdf_config.inc');
                }
                $this->domPdfInstance = new DOMPDF;
                $this->domPdfInstance->load_html($html);
            }

            public function render()
            {
                $this->domPdfInstance->render();
                return $this;
            }

            public function stream($filename)
            {
                if (pathinfo($filename, PATHINFO_EXTENSION) !== 'pdf') {
                    $filename .= '.pdf';
                }
                $this->domPdfInstance->stream($filename);
                return $this;
            }
        }

    For some reason, any PDF I generate using this results in it being corrupt (not opening under Mac OS X's Preview app, at least). I have even tried this basic HTML:

        <html>
          <head>
            <title>PDF</title>
          </head>
          <body>
            please work
          </body>
        </html>

    Is this a known problem? I have not touched anything in dompdf_config.inc.php except uncommenting the error reporting level so it now reads error_reporting(E_STRICT | E_ALL); PHP is not reporting any errors. Here is a screenshot of the error from Preview (though I think it's probably unhelpful). Some things I have thought may be a problem: autoloading conflicts, base path conflicts.

    Read the article

  • Progress Bar design patterns?

    - by shoosh
    The application I'm writing performs a lengthy algorithm which usually takes a few minutes to finish. During this time I'd like to show the user a progress bar which indicates how much of the algorithm is done, as precisely as possible. The algorithm is divided into several steps, each with its own typical timing. For instance:

    - initialization (500 ms)
    - reading inputs (5 sec)
    - step 1 (30 sec)
    - step 2 (3 minutes)
    - writing outputs (7 sec)
    - shutting down (10 ms)

    Each step can report its progress quite easily by setting the range it's working on, say [0 to 150], and then reporting the value it completed in its main loop. What I currently have set up is a scheme of nested progress monitors which form a sort of implicit tree of progress reporting. All progress monitors inherit from an interface IProgressMonitor:

        class IProgressMonitor
        {
        public:
            virtual void setRange(int from, int to) = 0;
            virtual void setValue(int v) = 0;
        };

    The root of the tree is the ProgressMonitor which is connected to the actual GUI interface:

        class GUIBarProgressMonitor : public IProgressMonitor
        {
            GUIBarProgressMonitor(ProgressBarWidget *);
        };

    Any other node in the tree is a monitor which takes control of a piece of the parent's progress:

        class SubProgressMonitor : public IProgressMonitor
        {
            SubProgressMonitor(IProgressMonitor *parent, int parentFrom, int parentLength)
            ...
        };

    A SubProgressMonitor takes control of the range [parentFrom, parentFrom+parentLength] of its parent. With this scheme I am able to statically divide the top-level progress according to the expected relative portion of each step in the global timing. Each step can then be further subdivided into pieces, etc. The main disadvantage of this is that the division is static, and it gets painful to make changes according to variables which are discovered at run time. So the question: are there any known design patterns for progress monitoring which solve this issue?
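    One sketch of a dynamic variation (illustrative only, not from the post): have the parent hold a weight per child and renormalize on every update, so a step's estimated share can be revised mid-run without re-plumbing any ranges.

        // Sketch: children report a completed fraction in [0, 1] plus a weight;
        // the composite renormalizes, so weights may change at run time.
        #include <vector>

        class CompositeProgressMonitor : public IProgressMonitor {
        public:
            explicit CompositeProgressMonitor(IProgressMonitor *parent) : parent_(parent) {
                parent_->setRange(0, 1000);
            }
            int addChild(double weight) {             // returns the child's index
                weights_.push_back(weight);
                fractions_.push_back(0.0);
                return (int)weights_.size() - 1;
            }
            void reweight(int child, double w) {      // runtime re-estimate of a step's cost
                weights_[child] = w;
                publish();
            }
            void childProgress(int child, double fraction) {
                fractions_[child] = fraction;
                publish();
            }
            void setRange(int, int) {}                // composite manages its own scale
            void setValue(int) {}
        private:
            void publish() {
                double total = 0.0, done = 0.0;
                for (size_t i = 0; i < weights_.size(); ++i) {
                    total += weights_[i];
                    done  += weights_[i] * fractions_[i];
                }
                if (total > 0.0) parent_->setValue((int)(1000.0 * done / total));
            }
            IProgressMonitor *parent_;
            std::vector<double> weights_, fractions_;
        };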

    Read the article

  • Calculate time of method execution and send to WCF service async

    - by Tim
    I need to implement time measurement for repository methods in my ASP.NET MVC project classes. The problem is that I need to send the timing data to a WCF service, which is time consuming. I'm thinking about threads, which can help call the WCF service asynchronously. But I have very little experience with them. Do I need to create a new thread each time, or can I create a global thread? If so, how? I have something like this:

    The StopWatch class:

        public class StopWatch
        {
            private DateTime _startTime;
            private DateTime _endTime;

            public void Start()
            {
                _startTime = DateTime.Now;
            }

            protected void StopTimerAndWriteStatistics()
            {
                _endTime = DateTime.Now;
                TimeSpan timeResult = _endTime - _startTime;

                // WCF proxy object
                var reporting = AppServerUtility.GetProxy<IReporting>();

                // Send data to server
                reporting.WriteStatistics(_startTime, _endTime, timeResult, "some information");
            }

            public void Stop()
            {
                // Here is the thread I have a question about
                var thread = new Thread(StopTimerAndWriteStatistics);
                thread.Start();
            }
        }

    Using the StopWatch class in a repository:

        public class SomeRepository
        {
            public List<ObjectInfo> List()
            {
                StopWatch sw = new StopWatch();
                sw.Start();

                // performing a long-running operation

                sw.Stop();
            }
        }

    What am I doing wrong with threads?
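    One commonly suggested alternative, sketched here as an illustration rather than a definitive answer: instead of constructing a dedicated Thread per Stop() call, queue the work on the CLR thread pool, which reuses a shared set of worker threads.

        // Sketch only: same Stop() semantics, but the pool supplies the thread.
        public void Stop()
        {
            System.Threading.ThreadPool.QueueUserWorkItem(_ => StopTimerAndWriteStatistics());
        }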

    Read the article

  • Repackaging Jasper-Reports into an application specific OSGi bundle, legal or not?

    - by Chris
    Hi, I wanted to ask a (probably silly) question regarding the packaging of existing open-source components as OSGi bundles (more specifically Jasper Reports). I have an application that I am converting from a monolithic jar-hell type architecture to something more modular, and OSGi is my weapon of choice. There are various modules I have in mind, but one of them is a reporting module. My own reporting module will be a jar file containing my code that should reference a Jasper Reports bundle. Trouble is, Jasper Reports depends on far, far too many libraries and is quite monolithic in its own right. I therefore wish to build my own Jasper Reports bundle, but this is where I start getting confused about the legality of repackaging. I don't plan to re-compile, but I do plan to re-bundle, removing known items that I do not require. Can anyone offer advice on whether I am permitted to repackage (not recompile or extend) open-source libraries into OSGi bundles without falling foul of the 'derivative works' clause of the LGPL? I noticed that Groovy seems to offer some monolithic jars that include all dependencies and actually goes so far as to re-arrange the packages of its dependencies so that there are no namespace conflicts. This seems to me to be a violation of the license, but if anyone can reassure me that this is legal then I would feel safer about my less intrusive custom bundling of Jasper Reports. Thanks for your time, Chris

    Read the article

  • SSRS2005 timeout error

    - by jaspernygaard
    Hi, I've been running around in circles the last 2 days trying to figure out a problem in our customer's live environment. I figured I might as well post it here, since Google gave me very limited information on the error message (5 results, to be exact). The error boils down to a timeout when requesting a certain report in SSRS2005, when a certain parameter is used. The deployment scenario is:

    - Machine #1: running Reporting Services (SQL2005, W2K3, IIS6)
    - Machine #2: running the data warehouse database (SQL2005, W2K3), which is the data source for #1

    Both machines are running on the same VM cluster and LAN. The report requests a fairly simple SP; let's call it sp(param $a, param $b). When requested with param $a filled, it executes correctly. When using param $b, it times out after the global timeout period has passed. If I run the stored procedure with param $b directly from SQL Management Studio on #2, it returns the results perfectly fine (within 3-4s). I've profiled the data warehouse database on #2, and when param $b is used, the query from the reporting service to the database never reaches #2. The error message that I get upon timeout, when using param $b and invoking the report directly from the SSRS web interface, is: "An error has occurred during report processing. Cannot read the next data row for the data set DataSet. A severe error occurred on the current command. The results, if any, should be discarded. Operation cancelled by user." The ExecutionLog for SSRS doesn't give me much information besides the error message rsProcessingAborted. I'm running out of ideas of how to nail this problem, so I would greatly appreciate any comments, suggestions or ideas. Thanks in advance!

    Read the article

  • Oracle: why does creating a trigger fail when there is a field called timestamp?

    - by Omar Kooheji
    I've just wasted the past two hours of my life trying to create a table with an auto-incrementing primary key based on this tutorial. The tutorial is great; the issue I've been encountering is that the CREATE TRIGGER fails if I have a column of type TIMESTAMP and a column called timestamp in the same table... Why doesn't Oracle flag this as an issue when I create the table? Here is the sequence of commands I enter.

    Creating the table:

        CREATE TABLE myTable (
          id        NUMBER PRIMARY KEY,
          field1    TIMESTAMP(6),
          timeStamp NUMBER
        );

    Creating the sequence:

        CREATE SEQUENCE test_sequence START WITH 1 INCREMENT BY 1;

    Creating the trigger:

        CREATE OR REPLACE TRIGGER test_trigger
        BEFORE INSERT ON myTable
        REFERENCING NEW AS NEW
        FOR EACH ROW
        BEGIN
          SELECT test_sequence.nextval INTO :NEW.ID FROM dual;
        END;
        /

    Here is the error message I get:

        ORA-06552: PL/SQL: Compilation unit analysis terminated
        ORA-06553: PLS-320: the declaration of the type of this expression is incomplete or malformed

    Any combination that does not have the two lines with the word "timestamp" in them works fine. I would have thought the syntax would be enough to differentiate between the keyword and a column name. As I've said, I don't understand why the table is created fine but Oracle falls over when I try to create the trigger...

    CLARIFICATION: I know that the issue is that there is a column called timestamp which may or may not be a keyword. MY issue is why it barfed when I tried to create a trigger and not when I created the table; I would have at least expected a warning. That said, having used Oracle for a few hours, it seems a lot less verbose in its error reporting. Maybe that's just because I'm using the Express edition, though. If this is a bug in Oracle, how would one who doesn't have a support contract go about reporting it? I'm just playing around with the Express edition because I have to migrate some code from MySQL to Oracle.
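    For what it's worth, a workaround sketch (my assumption, not something from the tutorial or the post): sidestep the keyword clash by renaming the column, or by creating it as a quoted identifier so that later PL/SQL references use the quoted name rather than the bare keyword.

        -- Sketch: rename the column (ts_value is a hypothetical name)...
        CREATE TABLE myTable (
          id       NUMBER PRIMARY KEY,
          field1   TIMESTAMP(6),
          ts_value NUMBER
        );

        -- ...or keep the name but quote it here and in every later reference:
        -- "TIMESTAMP" NUMBER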

    Read the article

  • Customizing the TFS 2008 build sequence to avoid compilation and deploy SSRS

    - by Andrew
    I'm trying to create a CI process for SQL Server Reporting Services. I am fairly new to TFS but quite experienced with MSBuild. In the past I've used a combination of MSBuild with TeamCity, so the whole build process is more or less custom. Here lies the start of my problems: as the solution I am deploying only contains Report Server projects (rds), no compilation is required. I thought that I would override the first default task that TFS runs (EndToEndIteration) to override the default TFS build sequence and inject my own. The first snag that I have come across is that the build always fails; how can I set the status of the build to success? Currently the EndToEndIteration task is very light and only has a message. Is this the best method to create a custom build process in TFS where compilation is not required? Or should I use the default sequence and override one of the hook tasks mentioned in http://msdn.microsoft.com/en-us/library/aa337604%28VS.80%29.aspx (i.e. AfterCompile)? The core steps that I'd like to achieve are:

    - Bundle the RDL and datasource files
    - Connect to the host server to register/deploy the reports
    - Re-apply any subscriptions that previously existed
    - Run tests to verify the deployment succeeded and is returning results as expected

    I have found another article on Reporting Services deployment: http://stackoverflow.com/questions/88710/reporting-services-deployment But it doesn't mention the best practice for customizing the standard build process. Any help would be appreciated.
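    For illustration, a rough sketch of the hook-task route (the .rss script name and server URL are hypothetical; only the AfterCompile target and rs.exe are from the documented toolset): override AfterCompile in TFSBuild.proj and shell out to the Reporting Services script host to deploy.

        <!-- TFSBuild.proj sketch: the solution has nothing to compile, so hang
             the RDL deployment off the AfterCompile hook mentioned above. -->
        <Target Name="AfterCompile">
          <Exec Command="rs.exe -i DeployReports.rss -s http://myserver/reportserver" />
        </Target>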

    Read the article

  • Every flash uploader giving bad progress values.

    - by Mike Boers
    The file upload script I wrote early last year for an internal website has been misbehaving oddly on a number of machines. On some machines it consistently works fine; on others it consistently misbehaves. I am having exactly the same problem with YUI Uploader, SWFUpload (2.2 and 2.5a), and Uploadify. On the misbehaving machines, the progress event (or callback, as the case may be) is reporting the upload going far too quickly. It is progressing at around 9 or 10 MB/s, instead of the 50 or 60 KB/s that is actually going on. The progress bar fills up very quickly, and then no more progress events are triggered. A few minutes later the completion event will trigger when the upload is actually done. I must emphasize that the file upload does proceed normally, even though the progress being reported is very wrong. The progress events are reporting a correct file size, but the reported amount uploaded is usually way too high, and it appears that it is always a multiple of 2^16 (65536). I'm only having this problem with Firefox 3.5 on Windows XP, all of which have various sub-versions of Flash 10. Has anyone heard of this happening, or have any idea what is going on? (I'm off to go file a number of bug reports, but hopefully someone here has some previous experience with this.)

    Read the article

  • UITableViewIndex Grows to Incorrect Bounds Unexpectedly

    - by chadburggraf
    I have a standard UITableView + UISearchDisplayController + UITableViewIndex setup. Everything works like a champ. Except, under very specific conditions, the index grows too long to display on the screen. Specifically, after ending a search and re-displaying the unfiltered, indexed table, the index sometimes grows too long. More specifically, this doesn't happen if I search then cancel. It only happens if I search, then push a view controller from the search table, then pop that view controller back to the still-searching table, then cancel the search, then re-search and then cancel that final search. After the end of the final search the index is too long. In portrait, the table view is reporting a height of 416 and the index a height of 404 under normal conditions. If I log from searchDisplayControllerDidEndSearch when the index is sized incorrectly, it is reporting a height of 620. I've tried everything from setLayout on the table and the index to manually re-sizing the frame. Nothing works (the manual re-size causes the correct height to be logged, but it doesn't change the display on screen). I was about to try re-sizing after a delay in case the cancel animation was interfering, but then I realized what an absurd situation I'm in and thought seeking help might be wise...

    Read the article

  • "rsAccessDenied" error for SSRS 2008

    - by JackLocke
    Hi All, I have been trying to access the SSRS Web Service URL hxxp://myServer:80/ReportServer (from Reporting Services Configuration Manager), but my IE always shows an "rsAccessDenied" message saying that my account doesn't have the privileges required to view it. Here are my system specs: it's my laptop with Windows 7 x64 and SQL Server 2008 with SP1, and I am using Mixed Mode Authentication with my account having SysAdmin privileges. This is what I have been trying/tried (of course restarting the service every time I make any change in configuration): I changed the service account in Reporting Services Configuration Manager to make it use my account, but nothing happened. I tried running my IE as admin (Run as Administrator), but still the same message. Then I read somewhere that I have to delete/recreate my encryption keys as well, so I tried again with that; then it was asking me to enter an ID/PWD to access the server, and here I am totally blank because it was not accepting my account credentials! The weird thing is that I can see my existing reports if I follow the URL hxxp://myServer:80/Reports, which my guess is solely used to view reports. I have read a post here about the same kind of problem, but it seems that the OP just left the forum after asking the question... Also, MSDN does have these helps: hxxp://msdn.microsoft.com/en-us/library/ms156034.aspx and hxxp://msdn.microsoft.com/en-us/library/bb630430.aspx but neither of these worked out for me. I will really appreciate it if anyone can help me out. Jack p.s. I was not allowed to post more than 1 URL because of my "reputation" so I had to change the string a bit. Please replace hxxp with http in URLs.

    Read the article

  • The perfect web UI framework (with Microsoft stack?) - architecture question?

    - by Igorek
    I'm looking for suggestions for the following issue, and I realize there is really not going to be a perfect answer to my question: I have a UI built in WinForms.NET (v4.0 framework) with a WCF back-end and EF4 model objects, that I am looking to port to the web. The UI is not huge and is not super complex, and is structured well. But it is not a super simple system either. I am looking to pick a technology stack for the web front-end that will target desktop and (partially) mobile platforms, provide a good development platform to build on, and facilitate code reuse across UI and back-end tiers...

    I would rather avoid:

    - custom coding of UI-centric scripts, because they are hard to debug, non-compiled, usually a maintenance nightmare, almost always start to contain business logic, and duplicate some of the logic that back-end tiers have (especially validation)
    - custom coding for desktop web and mobile web UIs separately (although I realize that the mobile web UI will likely contain fewer data-entry screens and more reporting screens)
    - non-.NET technology stacks

    I would love to:

    - target the reporting capabilities of the system toward mobile web browsers
    - not have to write a single line of script (JavaScript, jQuery, etc.)
    - utilize a good collection of controls that produces an elegant UI
    - use .NET for everything

    The way I see it right now, I need to re-write this app in Silverlight, utilize a 3rd-party UI framework like Telerik, and re-do the reports UI again for mobile platforms separately. However, I'm rather concerned about the shelf life of Silverlight and the need to deploy a different architecture to deal with the mobile platform. Is there an ASP.NET/MVC/Ajax architecture/framework/library that would allow me to get at the power of .NET without painful (IMHO) client-side scripting, while providing a decent user experience? Thank you

    Read the article

  • Report Viewer Configuration error - In View Source of webpage

    - by sarada
    I found the following error message when I checked the View Source of the web page, but the web page works just fine. Our test lead found the error while performing assertion tests.

        Report Viewer Configuration Error
        The Report Viewer Web Control HTTP Handler has not been registered in the
        application's web.config file. Add

        <add verb="*" path="Reserved.ReportViewerWebControl.axd"
             type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

        to the system.web/httpHandlers section of the web.config file, or add

        <add name="ReportViewerWebControlHandler" preCondition="integratedMode" verb="*"
             path="Reserved.ReportViewerWebControl.axd"
             type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

        to the system.webServer/handlers section for Internet Information Services 7 or later.

    Why is this error message coming up in the view source? Note: there is a div tag around this error message which has style="display:none". I am trying to find out why, but everybody has only discussed this error message as one which is thrown in the web page. The changes suggested to web.config are already present in our config file.

    Read the article

  • Best way to migrate export/import from SQL Server to Oracle

    - by matao
    Hi guys! I'm faced with needing access for reporting to some data that lives in Oracle and other data that lives in a SQL Server 2000 database. For various reasons these live on different sides of a firewall. Now we're looking at doing an export/import from SQL Server to Oracle, and I'd like some advice on the best way to go about it... The procedure will need to be fully automated and run nightly, so that excludes using the SQL developer tools. I also can't make a live link between databases from our (Oracle) side, as the firewall is in the way. The data needs to be transformed in the process from a star schema to a denormalised table ready for reporting. What I'm thinking about is writing a monster query for SQL Server (which I mostly have already) that will denormalise and read out the data from SQL Server into a flat file using the SQL Server equivalent of sqlplus as a scheduled task, dump it into a Well Known Location, then on the Oracle side have a cron job that copies down the file, loads it with SQL*Loader, and rebuilds indexes etc. This is all doable, but very manual. Is there one tool, or a combination of FOSS or standard Oracle/SQL Server tools, that could automate this for me? The irreducible complexity is the query on one side and building indexes on the other, but I would love to not have to write the CSV dumping detail or the SQL*Loader script; just say "dump this view out to CSV" on one side, and on the other "truncate and insert into this table from CSV", and not worry about mapping column names and all the other arcane sqlldr voodoo... Best practices? Thoughts? Comments? Edit: I have 50+ columns, all of varying types and lengths, in my dataset, which is why I'd prefer to not have to write out how to generate and map each single column...

    Read the article

  • Database for managing large volumes of (system) metrics

    - by symcbean
    Hi, I'm looking at building a system for managing and reporting stats on web page performance. I'll be collecting a lot more stats than are available in the standard log formats (approx 20 metrics), but compared to most types of database applications, the base data structure will be very simple. My problem is that I'll be accumulating a lot of data: in the region of 100,000 records (i.e. sets of metrics) per hour. Of course, resources are very limited! So that it's possible to sensibly interact with the data, I'd need to consolidate each metric into one-minute bins, broken down by URL; then for anything more than 1 day old, consolidate into 10-minute bins; then at 1 week, hourly bins. At the front end, I want to provide a view (preferably as plots) of the last hour of data, with the facility for users to drill up/down through defined hierarchies of URLs (which do not always map directly to the hierarchy expressed in the path of the URL) and to view different time frames. Rather than coding all this myself and using a relational database, I was wondering if there were tools available which would facilitate both the management of the data and the reporting. I had a look at Mondrian; however, I can't see from the documentation I've looked at whether it's possible to drop the more granular information while maintaining the consolidated views of the data. RRDTool looks promising in terms of managing the data consolidation, but seems to be rather limited in terms of querying the dataset as a multi-dimensional/relational database. What else should I be looking at?
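    To make the first consolidation tier concrete, here is an illustrative Python sketch (the record shape is made up; it is not part of the question): bin raw samples into one-minute buckets per URL, keeping a count and total so the coarser tiers (10-minute, hourly) can be re-aggregated later without the raw rows.

        from collections import defaultdict

        def consolidate(samples, bin_seconds=60):
            """samples: iterable of (unix_ts, url, metric, value) tuples."""
            bins = defaultdict(lambda: [0, 0.0])  # (bin_start, url, metric) -> [count, total]
            for ts, url, metric, value in samples:
                bin_start = int(ts) // bin_seconds * bin_seconds
                bins[(bin_start, url, metric)][0] += 1
                bins[(bin_start, url, metric)][1] += value
            # Count and total (not just the mean) survive, so the 10-minute and
            # hourly tiers can be built by summing the minute bins.
            return dict(bins)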

    Read the article

< Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >