Search Results

Search found 3729 results on 150 pages for 'sqlserver reporting servi'.

Page 84 of 150

  • Selenium-Nunit Program Structure

    - by Jacobm001
    My office has a suite of web reporting engines written in VB. All in all, there are about 300 reports with varying displays depending on the data being fed into them. I'm trying to establish an efficient way to deal with so much diversity, but am struggling to create a system that won't be a nightmare to code and maintain. What I've considered doing is: on program launch, read the steps required for each test page (there may be multiple tests for the same page with varying inputs); write each iteration of the test to an XML file under $env:temp/testname; then use the TestCaseSource attribute of NUnit to funnel every related XML file in as a source. My major stumbling block has been how to get that data to the NUnit framework. Is NUnit really appropriate for what I'm trying to do, or is it too static?

    Read the article

  • Easy QueryBuilder - A User-Friendly Ad-Hoc Advanced Search Solution

    Constructing an easy and powerful QueryBuilder interface is becoming more important for complex data grid filtering and accurate reporting services. In this article, I'll discuss how to build a query search engine using ASP.NET AJAX and dynamic SQL. The main goal is to provide an interactive interface that lets users select query attributes, operators, attribute values, and T-SQL operators so that the data context query list can be easily composed and a search engine invoked (a minimal sketch of the dynamic SQL piece follows below).

    Read the article
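
    A rough, minimal sketch of the dynamic SQL piece referenced above (not the article's actual code; dbo.Orders and its columns are invented for illustration). The column and operator are checked against a whitelist before being concatenated, and only the value travels as a real parameter via sp_executesql:

    ```sql
    -- Hypothetical schema: dbo.Orders and its columns are assumptions, not the article's.
    DECLARE @column   sysname       = N'CustomerName',  -- attribute picked in the UI
            @operator nvarchar(10)  = N'LIKE',          -- operator picked in the UI
            @value    nvarchar(100) = N'%Acme%',        -- value typed by the user
            @sql      nvarchar(max);

    -- Whitelist the column and operator before concatenating them into the statement;
    -- the value is passed as a parameter, which keeps SQL injection out.
    IF @column NOT IN (N'CustomerName', N'City', N'OrderDate') RETURN;
    IF @operator NOT IN (N'=', N'<>', N'LIKE', N'>', N'<') RETURN;

    SET @sql = N'SELECT * FROM dbo.Orders WHERE ' + QUOTENAME(@column)
             + N' ' + @operator + N' @value';

    EXEC sys.sp_executesql @sql, N'@value nvarchar(100)', @value = @value;
    ```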

  • Consolidating SQL Server Error Logs from Multiple Instances Using SSIS

    SQL Server hides a lot of very useful information in its error log files. Unfortunately, the process of hunting through all these logs, file by file, server by server, can be a problem in itself. Rodney Landrum offers a solution which will allow you to pull error log records from multiple servers into a central database, for analysis and reporting with T-SQL (a simplified sketch of the idea follows below).

    Read the article
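
    A simplified, single-instance sketch of the idea (the article itself builds an SSIS package to sweep many servers; CentralDB.dbo.ErrorLogRepository is an assumed name, not the article's):

    ```sql
    -- Stage the current error log locally, then copy it into a central repository
    -- tagged with the server name, so records from many instances can be analysed together.
    CREATE TABLE #ErrorLog
    (
        LogDate     datetime,
        ProcessInfo nvarchar(100),
        LogText     nvarchar(max)
    );

    INSERT INTO #ErrorLog (LogDate, ProcessInfo, LogText)
    EXEC sp_readerrorlog 0, 1;   -- 0 = current log, 1 = SQL Server error log

    INSERT INTO CentralDB.dbo.ErrorLogRepository (ServerName, LogDate, ProcessInfo, LogText)
    SELECT @@SERVERNAME, LogDate, ProcessInfo, LogText
    FROM #ErrorLog;

    DROP TABLE #ErrorLog;
    ```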

  • TFS Backup Plan Wizard Tool

    - by Enrique Lima
    With the release of the September 2010 TFS 2010 Power Tools came an addition to the Team Foundation Server Administration Console: the Team Foundation Backups tree item. The tool is used to create backup plans, and to work with it you run through a wizard, just as you would when configuring TFS or any of its extensions. The areas covered by the tool include: backup to a network backup path and retention configuration; under Advanced Options, the file extensions to be used for the full and transactional backups; and the ability to include external databases, meaning the reporting and SharePoint databases, as part of the plan. There are further options, as you can see, including being able to define a task scheduler account, set alerts for notification on execution of the plans, and, lastly, configure the schedule for the plan execution. All in all a very good tool and a great way to safeguard the investment you've made.

    Read the article

  • Well supported Hardware Raid Controller

    - by ftiaronsem
    I am currently planning to buy a hardware RAID controller. This became necessary since I am running Linux and Windows in parallel and now need redundancy for both OSes (I am going to use RAID 1 / mirroring). I am therefore searching for a hardware RAID controller that is well supported by Linux/Ubuntu (reporting SMART values, stats for the hard drives, etc.). The controller should have four SATA ports and, if possible, fit in a PCIe x1 slot. I would greatly appreciate it if you could suggest some devices. Thanks in advance.

    Read the article

  • Disqus thread migration. Gotchas?

    - by sramsay
    I've been migrating a site to a new domain. The site itself is pretty straightforward (it uses Jekyll), and everything has gone fine -- except the migration of Disqus threads. I've had partial success -- some of the threads have migrated successfully, but not all. I've tried the domain migration wizard (which caught a few), the URL mapper (which caught a few), and the 301 redirect crawler (which caught a few). But the remaining threads just won't move, no matter which method I use. So I suppose I'm asking whether there are any "gotchas" I should know about with this. When you execute any of these migration tools, it says it will "take a while." Does that mean hours? Days? I can't tell if it's working, and there's no logging or error reporting that I can see.

    Read the article

  • Stop Google Analytics from appending hostname?

    - by Nick Q.
    I've come across an Analytics profile that is appending the rest of a URL to the end of a page's path. For example when looking at the page that exists at http://example.com/page I would expect to see /page but instead it shows me /page/http://example.com/. The profile has no filters applied to it, and until July was reporting as expected (/page), in July the site in question switched hosts (and absolutely nothing else, so I'm not sure that's the problem). The analytics code on the site is the standard Google Async code with a domain set. All other profiles for the site show /page as expected. Any ideas as to how I can get the profile to function as expected?

    Read the article

  • Maintenance Wizard

    - by LuciaC
    The Maintenance Wizard is an E-Business Suite upgrade tool that can guide you through the code line upgrade process from 11.5.10.2 to 12.1.3 with an 11gR2 database. Additionally, it includes maintenance features for most releases of E-Business Suite applications. The tool:
      - Presents step-by-step upgrade and maintenance processes
      - Enables validation of each step, tracks the completion of the steps, and maintains a log and status
      - Is a multi-user tool that enables the System Administrator to give different users assignments based on any combination of category, product family or task
      - Automatically installs many required patches
      - Provides project management utilities to record the time taken for each task, completion status and project reporting
    For more information: review Doc ID 215527.1 for additional information on the Maintenance Wizard, and see Doc ID 430732.1 to download the new patch.

    Read the article

  • Cannot start IIS on XP64 Professional

    - by headsling
    I have enabled the IIS components for the first time on this machine, from Control Panel > Add Software > Windows Components, and rebooted. When I attempt to start the "Default Web Site" (which shows as 'Default Web Site (Stopped)') from inetmgr, nothing happens: no errors, nothing in the event log. Regmon and Filemon show no obvious errors. I've confirmed that nothing is running on port 80 and have tried changing the local port (8080, 8081, etc.) with no effect. Further info: I have removed the install, rebooted, added it back in, rebooted... still not working. I have Visual Studio 2008 and SQL Server 2008 installed.

    Read the article

  • How does activity id affect calculations such as schedule % complete when using a baseline?

    - by Jeffrey McDaniel
    Fields such as schedule % complete, planned value costs, etc. that use a baseline to help determine the value depend on the activity ids matching between the baseline project and the current project. If the activity id is changed, the link is broken. In the P6 power client there is an internal guid that allows you to change the activity id in either the baseline or current project and still have these values related. In the P6 Reporting Database, the activity id is used as the join key that determines which activities in a baseline project match which activities in a current project (a simplified illustration follows below).

    Read the article
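
    A simplified illustration of that join; the table and column names below are hypothetical, not the actual P6 Reporting Database schema, and only show that activity id is the match key:

    ```sql
    -- Activities line up between the baseline and current project purely by activity id,
    -- so renaming an activity in either project breaks the comparison.
    SELECT cur.activity_id,
           cur.schedule_pct_complete AS current_schedule_pct,
           bl.planned_value_cost     AS baseline_planned_value
    FROM   current_project_activity  AS cur
    JOIN   baseline_project_activity AS bl
           ON bl.activity_id = cur.activity_id;
    ```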

  • Oracle@Work: IDS brings light into investment controlling with Exadata

    - by A&C Redaktion
    The amount of data that IDS GmbH (Analysis and Reporting Services) has to handle every day is enormous: the Allianz SE subsidiary provides all services around investment controlling. The company needed an extensible data warehouse solution in which all data could be consolidated, harmonized and enriched. IDS ultimately settled on Exadata as the optimal solution, specifically the Oracle Exadata Database Machine. The implementation was carried out jointly with Oracle Platinum Partner ISE, which took on the technical and advisory part and continues to support IDS in further development. You can see how Exadata is used there and why this investment has paid off for IDS in the video here:

    Read the article

  • Not assigning Bugs to a specific user

    - by user2977817
    My question: is there a benefit to NOT assigning a bug to a particular developer, and instead leaving it to the team as a whole? Our department has decided to be more agile by not assigning bugs/defects to individuals. Using Team Foundation Server 2012, we'll place all bugs in a development team's "Area" but leave the "Assigned To" field blank. The idea is that the team will create a Task work item which will be assigned to an individual, and the Task will link to the Bug. The team as a whole will therefore take responsibility for the bug, not an individual, aligning to Scrum - apparently. I see the downside: the reporting tools built into TFS become less useful when you cannot sort by assigned vs. unassigned, let alone sort by which user a bug is assigned to. Is there a benefit I'm not seeing, besides encouraging teamwork by putting the responsibility on the team as a whole instead of an individual?

    Read the article

  • Suggestion: ALLFILES option for RESTORE

    - by Greg Low
    The default action when performing a backup is to append to the backup file, yet the default action when restoring a backup is to restore just the first file. I constantly come across customer situations where they are puzzled that they seem to have lost data after they have completed a restore. Invariably, it's just that they haven't restored all the backups contained within a single OS file. This happens most commonly with log backups, but it also happens when they have not restored the most recent database backup file. It is not trivial to achieve this within simple T-SQL scripts when the number of backup files within the OS file is unknown. It really should be. I'd like to see a FILES=ALLFILES option on the RESTORE command. For RESTORE DATABASE, it should restore the most recent database backup plus any subsequent log files. For RESTORE LOG (which is the most important missing option), it should just restore all relevant log backups that are contained. If you agree, you know what to do: please vote: https://connect.microsoft.com/SQLServer/feedback/details/769204/option-to-restore-all-backups-files-within-a-media-set Alternately, how would you write a T-SQL command to restore all log backups within a single OS file where the number of files is unknown? I would love to hear creative solutions, because all the ones that I can think of are pretty messy and need dynamic SQL (one such attempt is sketched below).

    Read the article
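
    One of the "messy, dynamic SQL" attempts alluded to above might look like the sketch below: keep restoring the next backup set from the same OS file until an attempt fails. The database name and path are placeholders, and it assumes the file holds only log backups for that database, taken in sequence, with the database already restored WITH NORECOVERY:

    ```sql
    DECLARE @file int = 1,
            @sql  nvarchar(max);

    WHILE 1 = 1
    BEGIN
        -- Try the backup set at position @file within the single OS file.
        SET @sql = N'RESTORE LOG MyDb FROM DISK = N''C:\Backups\MyDb.trn'' '
                 + N'WITH FILE = ' + CAST(@file AS nvarchar(10)) + N', NORECOVERY;';
        BEGIN TRY
            EXEC sys.sp_executesql @sql;
        END TRY
        BEGIN CATCH
            BREAK;   -- no backup set at this position, so we have run out of files
        END CATCH;
        SET @file += 1;
    END;

    -- RESTORE DATABASE MyDb WITH RECOVERY;  -- bring the database online afterwards
    ```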

  • SSRS Report Parts Versus Sub Reports FAQ

    I'm trying to decide on a development strategy to satisfy the reporting needs in my organization. I would like to increase our efficiency in responding to report requests, while minimizing our maintenance burden. Two topics that I would like to dig into are Report Parts and Subreports. Can you provide some considerations for using one versus the other?

    Read the article

  • OpenWorld Presentations and Anatomy of an RTF Template w/ files

    - by mdonohue
    For those who missed it... or those who made it and couldn't get enough, check out the presentations delivered at OpenWorld:
      - Overview and Roadmap
      - The Reporting Platform for Oracle Applications
      - Best Practices
    and, even though it wasn't presented at OpenWorld, an updated version of Anatomy of an RTF Template, which now includes documented example files (RTF template, sub-template and sample XML data) so you can re-use and play with the code directly. Huge thanks to Tim and Hok-Min, who did all the hard, original work on this example, loaded with tips and tricks.

    Read the article

  • Resize a 2TB partition on a 3TB disk created with fdisk

    - by mR_fr0g
    I recently added a new 3TB hard drive to a headless media server (HP ProLiant MicroServer) running Ubuntu Server 12.04. I followed this tutorial, which uses fdisk to create a single partition of the maximum size reported by fdisk. I chose the ext4 format. I then copied across all my media, which took some time. I am guessing that fdisk has a 2TB limit, because du is reporting this as the size. Is there any way to increase the size of the partition to 3TB without having to copy all my media over again?

    Read the article

  • Workstation Build: Single 2.66ghz i-7 with overclock potential, OR Dual 5520 2.26ghz Xeons?

    - by jdc0589
    There are probably better places to ask this, but I am used to the excellent quality of responses on Stack Overflow. I am rebuilding my desktop in a few months. Aside from normal lightweight internet usage, I use it to run SQL Server, MySQL, 1-2 Ubuntu VMs from time to time, lots of IDEs, and a media server for my PS3. The two possible setups cost the exact same amount (within $50) and would both have 12GB of 1333MHz DDR3 RAM and a 500GB RAID 0 array (250GB x2). Now, if I go with a single i7 920 2.66GHz quad-core, I can easily overclock it to 3GHz and would have cash left over to get a 160GB SSD (either the OCZ Vertex or the 120GB Intel) for the main OS/program install drive. Otherwise, I could get a dual LGA1366 motherboard with two E5520 Xeons (2.26GHz) and just use the disks I already have. So, do I go for 8 physical / 16 virtual cores at 2.26GHz (no overclocking on server boards) with normal disk I/O, or 4 physical / 8 virtual cores at 3.0GHz with really outstanding disk I/O?

    Read the article

  • Business Analytics Newsletter v6 is out

    - by THE
    Our latest Business Analytics Newsletter (v6) has just gone live. This edition features the following topics:
      - Profitability and Cost Management on Exalytics
      - Hyperion Calculation Manager Oracle By Example - New Tutorial
      - OBIEE 11g: Current and Future Patching Strategy
      - OBIEE releases
      - EPM Patch Set Updates - Recent Releases
      - Product Retirement: Hyperion Application Builder, Oracle Essbase Spreadsheet Add-In, OBIEE 11.1.1.5, Hyperion Financial Reporting XBRL Functionality
    Of course, you will also find all the relevant links to webcasts, communities, whitepapers, social media, etc. regarding Oracle Business Analytics in this newsletter. You can find the newsletter at https://support.oracle.com/rs?type=doc&id=1347131.1

    Read the article

  • Introducing Next-Generation Enterprise Auditing and Database Firewall Platform Webcast, 12/12/12

    - by Troy Kitch
    Join us, December 12 at 10am PT/1pm ET, to hear about a new Oracle product that monitors Oracle and non-Oracle database traffic, detects unauthorized activity including SQL injection attacks, and blocks internal and external threats from reaching the database. In addition, this new product collects and consolidates audit data from databases, operating systems, directories, and any custom template-defined source into a centralized, secure warehouse. This new enterprise security monitoring and auditing platform allows organizations to quickly detect and respond to threats with powerful real-time policy analysis, alerting and reporting capabilities. Based on proven SQL grammar analysis that ensures accuracy, performance, and scalability, organizations can deploy with confidence in any mode. You will also hear how organizations such as TransUnion Interactive and SquareTwo Financial rely on Oracle today to monitor and secure their Oracle and non-Oracle database environments. Register for the webcast here.

    Read the article

  • Partner case - ISE (Germany) - IDS brings light into Investment Controlling with Exadata

    - by Javier Puerta
    (Original post in German: IDS bringt mit Exadata Licht ins Investmentcontrolling) "The amount of data that IDS GmbH (Analysis and Reporting Services) has to cope with daily is enormous: the subsidiary of Allianz SE provides all the services around investment controlling. The company needed an extensible data warehouse solution in which all the data could be merged, harmonized and enriched. IDS finally decided on Exadata as the optimal solution, specifically the Oracle Exadata Database Machine. The implementation was carried out jointly with the Oracle Platinum Partner ISE, who took over the technical and advisory part and will be IDS' preferred consultant for any further Exadata development. See how Exadata is used and why this investment has paid off for IDS by watching the following video (in German)."

    Read the article

  • April 11: Live Webcast for Oracle Configuration Controls Governor (CCG) for PeopleSoft 9.1

    - by Theresa Hickman
    Are you a PeopleSoft 9.1 Financials, HCM, or Campus Solutions customer who would like to know how you can automatically track changes to key configurations of these applications? With increasing regulatory requirements and the complex reporting required to meet these corporate compliance objectives, manual tracking of changes is not the ideal option and is prone to error and increased risk of fraud. Speakers from Oracle, Accenture & FulcrumWay will explain the business benefits of Oracle GRC change management solutions and present a business use case featuring a leading healthcare company. When: April 11, 2012. Time: 11:00 am (PST) / 2:00 pm (EST). Duration: 1 hr. Register Now!

    Read the article

  • Summarising and Bubbling of KPI data

    - by simonsabin
    Something I'm very conscious of when delivering a BI solution is being able to show the facts in a concise way, but also not hiding what's going on. I was reminded of this when I looked at the weather today. Everywhere they are reporting weather warnings for the south east, and so I thought I'd check on the BBC website http://news.bbc.co.uk/weather/forecast/4281?area=AL5 Looking at that, I thought we were going to miss the worst of it, just like a few weeks ago. However, from previous experience...(read more)

    Read the article

  • Improve Bad testing

    - by SetiSeeker
    We have a large team of developers and testers, with a ratio of one tester for every developer. We have full bug tracking and reporting systems in place. We have test plans in place. For every change to the product, the testing team is involved in the design of the feature and is included in the development process as much as possible. We build in small iterative blocks using the Scrum methodology, and the testers are included in every sprint, including the grooming sessions and so on. But with every release of the product, they miss even the most simple and obvious defects. How can we improve this?

    Read the article

  • Is 500 million lines of code even remotely possible? [on hold]

    - by kmote
    The New York Times is reporting that the Healthcare.gov website contains "about 500 million lines of software code." This number, attributed to "one specialist" and widely repeated across the interwebs, seems incredibly far-fetched (even assuming a large fraction of that number includes standard libraries). If this is an accurate estimate, it would truly be staggering (as this fascinating infographic vividly reveals). I realize StackExchange:Programmers isn't Snopes.com, but I'd like to find out if anyone here believes this is even remotely possible. I'd like to know if there is a plausible system of accounting (using examples from publicly available data, if possible) that could lead someone to conclude that such an estimate is within the realm of reason. How could a codebase (by any measure) sum up to such an exorbitant number of lines of code?

    Read the article

  • Getting error while initializing entities [closed]

    - by R76
    I am a newbie WPF dev. I am developing a Windows application in WPF using the MVVM Light framework. I have created a database in SQL Server Compact 4.0 and made an ADO.NET Entity Data Model. When I try to initialize the entity object in the service, it throws an error like: Error 'The invocation of the constructor on type 'PointOfSale.ViewModels.ProductsViewModel' that matches the specified binding constraints threw an exception.' Line number '7' and line position '10'. Stack trace: at System.Windows.Markup.XamlReader.RewrapException(Exception e, IXamlLineInfo lineInfo, Uri baseUri) at System.Windows.Markup.WpfXamlLoader.Load(XamlReader xamlReader, IXamlObjectWriterFactory writerFactory, Boolean skipJournaledProperties, Object rootObject, XamlObjectWriterSettings settings, Uri baseUri) at System.Windows.Markup.WpfXamlLoader.LoadBaml(XamlReader xamlReader, Boolean skipJournaledProperties, Object rootObject, XamlAccessLevel accessLevel, Uri baseUri) at System.Windows.Markup.XamlReader.LoadBaml(Stream stream, ParserContext parserContext, Object parent, Boolean closeStream) at System.Windows.Application.LoadComponent(Object component, Uri resourceLocator) at PointOfSale.MainWindow.InitializeComponent() in e:\VarniApplication\PointOfSale\PointOfSale\MainWindow.xaml:line 1 at PointOfSale.MainWindow..ctor() in E:\VarniApplication\PointOfSale\PointOfSale\MainWindow.xaml.cs:line 27 Inner exception: {"Unable to load the specified metadata resource."} My code: xyzEntities entites; public ctor() { entites = new xyzEntities(); //This line throws an error } I have installed SQL Server Compact 4.0 from the web installer 3.0 and added the SQL Server Compact Toolbox from the Extension Manager. Please tell me if I am missing something to install or something wrong in my code.

    Read the article
