Search Results

Search found 3137 results on 126 pages for 'reporting avatar'.


  • Introducing Microsoft SQL Server 2008 R2 - Business Intelligence Samples

    - by smisner
    On April 14, 2010, Microsoft Press (blog | twitter) released my latest book, co-authored with Ross Mistry (twitter), as a free ebook download - Introducing Microsoft SQL Server 2008 R2. As the title implies, this ebook is an introduction to the latest SQL Server release. Although you'll find a comprehensive review of the product's features in this book, you will not find the step-by-step details that are typical in my other books. For those readers who are interested in a more interactive learning experience, I have created two sample files for download: the IntroSQLServer2008R2Samples project and the Sales Analysis workbook. Here's a recap of the business intelligence chapters and the samples I used to generate the screenshots, by chapter:

    Chapter 6: Scalable Data Warehousing covers a new edition of SQL Server, Parallel Data Warehouse. Understandably, Microsoft did not ship me the software and hardware to set up my own Parallel Data Warehouse environment for testing purposes, so you won't see any screenshots in this chapter. I received a lot of information and a lot of help from the product team during the development of this chapter to ensure its technical accuracy.

    Chapter 7: Master Data Services is a new component in SQL Server. After you install Master Data Services (MDS), which is a separate installation from SQL Server although it's found on the same media, you can install sample models to explore (which is what I did to create screenshots for the book). To do this, you deploy the packages found at \Program Files\Microsoft SQL Server\Master Data Services\Samples\Packages. You will first need to use the Configuration Manager (in the Microsoft SQL Server 2008 R2\Master Data Services program group) to create a database and a Web application for MDS. Then, when you launch the application, you'll see a Getting Started page with a Deploy Sample Data link that you can use to deploy any of the sample packages.

    Chapter 8: Complex Event Processing is an introduction to another new component, StreamInsight. This topic was far too large to cover in depth in a single chapter, so I focused on information such as architecture, development models, and an overview of the key sections of code you'll need to develop for your own applications. StreamInsight is an engine that operates on data in flight and as such has no user interface that I could include in the book as screenshots. The November CTP version of SQL Server 2008 R2 included code samples as part of the installation, but these are not the official samples that will eventually be available on CodePlex. At the time of this writing, the samples are not yet published.

    Chapter 9: Reporting Services Enhancements provides an overview of all the changes to Reporting Services in SQL Server 2008 R2, and there are many! In previous posts, I shared more details than you'll find in the book about the new functions (Lookup, MultiLookup, and LookupSet - a small illustration appears at the end of this excerpt), the properties for page numbering, and the new global variable RenderFormat. I will confess that I didn't use actual data in the book for my discussion of the Lookup functions, but I did create real reports for the blog posts and will upload those separately. For the other screenshots and examples in the book, I have created the IntroSQLServer2008R2Samples project for you to download. To preview these reports in Business Intelligence Development Studio, you must have the AdventureWorksDW2008R2 database installed, and you must download and install SQL Server 2008 R2. For the map report, you must execute the PopulationData.sql script that I included in the samples file to add a table to the AdventureWorksDW2008R2 database. The IntroSQLServer2008R2Samples project includes the following files:

    01_AggregateOfAggregates.rdl to illustrate the use of embedded aggregate functions
    02_RenderFormatAndPaging.rdl to illustrate the use of page break properties (Disabled, ResetPageNumber), the PageName property, and the RenderFormat global variable
    03_DataSynchronization.rdl to illustrate the use of the DomainScope property
    04_TextboxOrientation.rdl to illustrate the use of the WritingMode property
    05_DataBar.rdl
    06_Sparklines.rdl
    07_Indicators.rdl
    08_Map.rdl to illustrate a simple analytical map that uses color to show population counts by state
    PopulationData.sql to provide the data necessary for the map report

    Chapter 10: Self-Service Analysis with PowerPivot introduces two new components of the Microsoft BI stack, PowerPivot for Excel and PowerPivot for SharePoint, which you can learn more about at the PowerPivot site. To produce the screenshots for this chapter, I created the Sales Analysis workbook, which you can download (although you must have Excel 2010 and the PowerPivot for Excel add-in installed to explore it fully). It's a rather simple workbook because space in the book did not permit a complete exploration of all the wonderful things you can do with PowerPivot. I used a tutorial that was available with the CTP version as a basis for the report, so it might look familiar if you've already started learning about PowerPivot.

    In future posts, I'll continue exploring the new features in greater detail. If there are any special requests, please let me know!
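    As promised above, here is a small sketch of the new Lookup function in a report expression. The dataset and field names (PopulationDataset, StateCode, Population) are hypothetical placeholders, not the names used in the book's samples:

        =Lookup(Fields!StateProvinceCode.Value, Fields!StateCode.Value, Fields!Population.Value, "PopulationDataset")

    Lookup evaluates the first expression in the current scope, matches it against the second expression evaluated in the named dataset, and returns the third expression from the first matching row.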

    Read the article

  • How to submit sitemap when your website has partial https? - Error: "Not in Domain"

    - by Ralph N
    My website is an ecommerce site that is set up to use http for the item browsing portion, but https for things like the shopping cart, contact us, etc. (anything that has forms on it). I submitted my website a long time ago to Google Webmaster Tools as http://www.mywebsite.com. I also submitted a sitemap with about 40 links - 8 of them are https. I've noticed that for the longest time, Google Webmaster Tools was reporting that 32 out of the 40 links have been crawled. I tested all the links against my robots.txt and realized that my robots.txt was blocking the https links. Google says those links are "Not In Domain". Is there a way I'm supposed to get around this so that I can have a hybrid-SSL site? I understand the concept that one site is mywebsite.com:80 and the other is mywebsite.com:443, but I'd like to avoid submitting and maintaining 2 separate websites on Google Webmaster Tools.
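    Since crawlers fetch robots.txt separately for each protocol, one possible approach (a sketch only, not a confirmed fix) is to make sure the robots.txt served over https does not block the secure pages:

        # robots.txt as served from https://www.mywebsite.com/robots.txt
        User-agent: *
        Disallow:

    An empty Disallow line permits crawling of everything on the https side; whether Google will then index pages listed only in the http property's sitemap is a separate question.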

    Read the article

  • Sharp HealthCare Reduces Storage Requirements by 50% with Oracle Advanced Compression

    - by [email protected]
    Sharp HealthCare is an award-winning integrated regional health care delivery system based in San Diego, California, with 2,600 physicians and more than 14,000 employees. Sharp HealthCare's data warehouse forms a vital part of the information system's infrastructure and is used to separate business intelligence reporting from time-critical health care transactional systems. Faced with tremendous data growth, Sharp HealthCare decided to replace their existing Microsoft products with a solution based on Oracle Database 11g and to implement Oracle Advanced Compression. Join us to hear directly from Kim Nguyen, the primary DBA for the Data Warehouse Application Team, how the new environment significantly reduced Sharp HealthCare's storage requirements and improved query performance.
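    For a rough flavor of what adopting Oracle Advanced Compression involves, here is a minimal sketch (the table name is a hypothetical placeholder; the webcast, not this snippet, describes Sharp HealthCare's actual configuration):

        -- Mark a table for OLTP compression (Oracle Database 11g Advanced Compression)
        ALTER TABLE claims_fact COMPRESS FOR OLTP;
        -- Rebuild the segment so existing rows are stored compressed
        ALTER TABLE claims_fact MOVE COMPRESS FOR OLTP;

    New and updated rows are then compressed as blocks fill, trading a small CPU cost for substantially less storage and I/O.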

    Read the article

  • Oracle: Addressing Information Overload in Factory Automation

    - by [email protected]
    Oracle's Stephen Slade has written about addressing information overload on the factory floor. According to Slade, today's automated processes create large amounts of valuable data, but only a small percentage remains actionable. Oracle claims information overload can cost financially, as companies struggle to store and collect reams of data needed to identify embedded trends, while producing manual reports to meet quality standards, regulatory requirements and general reporting goals. Increasing scrutiny of new requirements and standards adds to the need to find new ways to process data. Many companies are now using analytical engines to contextualise data into 'actionable information'. Oracle claims factories need to seriously address their data collection, audit trail and records retention processes. By organising their data, factories can maximise outcomes from excellence and continuous improvement programs, and gain visibility into costs in the supply chain. Analytics tools and technologies such as Business Intelligence (BI), Enterprise Manufacturing Intelligence (EMI) and Manufacturing Operations Centers (MOC) can help consolidate, contextualise and distribute information. FULL ARTICLE: http://www.myfen.com.au/news/oracle--addressing-information-overload-in-factory

    Read the article

  • Smarter, Faster, Cheaper: The Insurance Industry’s Dream

    - by Jenna Danko
    On June 3rd, I saw the Gaylord Resort Centre in Washington D.C. become the hub for C-level executives and managers of insurance carriers at the IASA 2013 Conference. Insurance accounting/regulation and technology sessions took the focus, but there were plenty of tertiary sessions for career development, which complemented the overall strong networking side of the conference. As an exhibitor, Oracle, along with several hundred other product providers, welcomed the opportunity to display and demonstrate our solutions, and we were encouraged by the hustle and bustle of the exhibition floor. The IASA organizers had pre-arranged fast-track tours whereby interested conference delegates could sign up for a series of like-themed presentations from vendors, giving them a level of 'speed dating' introductions to possible solutions and services. Oracle participated in a number of these, which were very well subscribed. Clearly, the conference had a strong business focus; however, attendees saw technology as a key enabler to get their processes done smarter, faster and cheaper. As we navigated the exhibition, it became clear from the inquiries that came to us that insurance carriers are gravitating to a number of focus areas:

    Navigating the maze of upcoming regulatory reporting changes. For US carriers with European holdings, Solvency II carries a myriad of rules and reporting requirements. Alignment across the globe of the Own Risk and Solvency Assessment (ORSA) processes brings to the fore the National Association of Insurance Commissioners' (NAIC) recent guidance manual publication.

    Doing more with less, and certainly expecting more from technology for fewer dollars. The overall cost of IT, in particular hardware, has dropped in real terms (though the appetite for more has risen: more CPU, more RAM, more storage), but software has seen less change. Clearly, customers expect either to pay less or to get a lot more from their software solutions for the same buck.

    Doing things smarter - a recognition that, with the advance of technology, standing still means going backwards technically. Technology, and in particular the interaction of technology with human business processes, has undergone incredible change over the past 5 years. Consumer usage (iPhones, etc.) has been at the forefront, but ever more effective technology exploitation is now beginning to take place at the enterprise level: data, and in particular gleaning knowledge from data, is refining and improving business processes. Organizations are now consuming more data than ever before, and it is set to grow exponentially for some time to come. Amassing large volumes of data is one thing, but effectively analyzing that data is another. It is the results of such analysis that lead to improvements, both in the insurance product offerings and in the processes that support them.

    Regulatory compliance - damned if you do and damned if you don't! Clearly, a lot is changing around the globe from a regulatory perspective, and it is evident that, whilst there is greater convergence across jurisdictions bringing uniformity, there is also a lot of work to be done in the next 5 years. Just as with big data, behind effective regulatory compliance there often lie golden nuggets that can give competitive advantages.

    From Oracle's perspective, our Rating Engine, Billing, Document Management and Insurance Analytics solutions on display served to strike up good conversations and, as is always the case at conferences, it was a great opportunity to meet and speak with existing Oracle customers that we might not have otherwise caught up with for a while. Fortunately, I was able to catch a few sessions at the close of the exhibition. The speaker quality was high and the audience asked challenging but pertinent questions. During Dr. Jackie Freiberg's keynote, "Bye Bye Business as Usual," the author discussed 8 strategies to help leaders create a culture where teams consistently deliver innovative ideas by disrupting the status quo. The very first strategy: get wired for innovation. Freiberg admitted that folks in the insurance and financial services industry understand and know innovation is important, but oftentimes they are slow adopters. Today, technology and innovation go hand in hand. In speaking to delegates during and after the conference, a high degree of satisfaction could be measured from their positive comments about the speaker sessions and the exhibitors. I suspect many will be back in 2014, with Indianapolis as the conference location. Did you attend the IASA Conference in Washington D.C.? If so, I would love to hear your comments. Andrew Collins is the Director, Solvency II of Oracle Financial Services. He can be reached at andrew.collins AT oracle.com.

    Read the article

  • Microsoft Team Foundation Server 2010 Service Pack 1

    - by javarg
    Last week Microsoft released the first Service Pack for Team Foundation Server. Several issues have been fixed and included in this patch. Check out the list of fixes here. Cool stuff has shipped with this new release, such as the expected Project Server Integration. PS: note that these annoying bugs have been fixed: Team Explorer: When you use a Visual Studio 2005 or a Visual Studio 2008 client, you encounter a red "X" on the reporting node of Team Explorer. Source Control: You receive the error "System.IO.IOException: Unable to read data from the transport connection: The connection was closed." when you try to download a source file.

    Read the article

  • SQL SERVER – CXPACKET – Parallelism – Usual Solution – Wait Type – Day 6 of 28

    - by pinaldave
    CXPACKET has to be the most popular of all wait stats. I have commonly seen this wait stat among the top 5 wait stats in most systems with more than one CPU.

    Books Online: "Occurs when trying to synchronize the query processor exchange iterator. You may consider lowering the degree of parallelism if contention on this wait type becomes a problem."

    CXPACKET explanation: When a parallel operation is created for a SQL query, there are multiple threads for a single query, and each thread deals with a different set of the data (or rows). For various reasons, one or more of the threads lag behind, creating the CXPACKET wait stat. There is an organizer/coordinator thread (thread 0) which waits for all the threads to complete and gathers the results together to present on the client's side. The organizer thread has to wait for all the threads to finish before it can move ahead, and this wait for slow threads to complete is called the CXPACKET wait. Note that not all CXPACKET waits are bad. You might experience a case where they totally make sense, and there might also be cases where they are unavoidable. If you remove this particular wait type for a query, the query may run slower because parallel operations are disabled for it.

    Reducing CXPACKET waits: We cannot discuss reducing CXPACKET waits without talking about the server workload type.

    OLTP: On a pure OLTP system, where transactions are small and queries are quick rather than long-running, set the "Max Degree of Parallelism" to 1 (one). This makes sure a query never goes parallel and does not incur extra engine overhead.

        -- 'max degree of parallelism' is an advanced option; expose it first
        EXEC sys.sp_configure N'show advanced options', N'1'
        GO
        RECONFIGURE WITH OVERRIDE
        GO
        EXEC sys.sp_configure N'max degree of parallelism', N'1'
        GO
        RECONFIGURE WITH OVERRIDE
        GO

    Data warehousing / reporting server: As queries will be running for a long time, it is advised to set the "Max Degree of Parallelism" to 0 (zero). This way most queries will utilize the parallel processors, and long-running queries get a performance boost from multiple processors.

        EXEC sys.sp_configure N'max degree of parallelism', N'0'
        GO
        RECONFIGURE WITH OVERRIDE
        GO

    Mixed system (OLTP & OLAP): Here is the challenge: the right balance has to be found. I have taken a very simple approach. I set the "Max Degree of Parallelism" to 2, which means a query still uses parallelism but only on 2 CPUs. However, I keep the "Cost Threshold for Parallelism" quite high. This way not all queries qualify for parallelism; only queries with a higher cost go parallel. I have found this to work best for a system that runs OLTP queries and also serves as a reporting server. Here I am setting the 'Cost Threshold for Parallelism' to a value of 25 (just for illustration); you can choose any value, and you can find the right one only by experimenting with your system. In the following script, I set 'Max Degree of Parallelism' to 2, which means a query with a cost higher than 25 will qualify for parallelism and run on 2 CPUs; regardless of the number of CPUs in the server, such a query will use at most two of them.

        EXEC sys.sp_configure N'cost threshold for parallelism', N'25'
        GO
        EXEC sys.sp_configure N'max degree of parallelism', N'2'
        GO
        RECONFIGURE WITH OVERRIDE
        GO

    Read all the posts in the Wait Types and Queues series. Additionally, a must-read is the comment by Jonathan Kehayias.
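    To see how much CXPACKET waiting your own server has accumulated, here is a minimal sketch (my addition to this excerpt, not from the original post) against the standard sys.dm_os_wait_stats DMV:

        -- Cumulative CXPACKET waits since the last restart or stats clear
        SELECT wait_type,
               waiting_tasks_count,
               wait_time_ms,
               signal_wait_time_ms
        FROM sys.dm_os_wait_stats
        WHERE wait_type = N'CXPACKET';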
    Note: The information presented here is from my experience, and I in no way claim it to be accurate. I suggest you read Books Online for further clarification. All of the discussion of wait stats here is generic, and it varies from system to system. It is recommended that you test this on a development server before implementing it on a production server. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: DMV, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL Wait Stats, SQL Wait Types, T SQL, Technology

    Read the article

  • SQL SERVER – Simple Example to Configure Resource Governor – Introduction to Resource Governor

    - by pinaldave
    Let us jump right into question and answer mode.

    What is Resource Governor? Resource Governor is a feature which can manage SQL Server workload and system resource consumption. We can limit the amount of CPU and memory consumption by limiting/governing/throttling on the SQL Server.

    Why is Resource Governor required? When different workloads run on SQL Server and each workload needs different resources, or when workloads compete for resources and affect the performance of the whole server, Resource Governor becomes very important.

    What would be a real-world example of the need for Resource Governor? Here are two simple scenarios where Resource Governor can be very useful.

    Scenario 1: A server runs an OLTP workload and various resource-intensive reports at the same time. The ideal situation is two servers, data-synced with each other, where one server runs the OLTP transactions and the second runs all the resource-intensive reports. However, not everybody has the luxury of setting up this kind of environment. When reports and OLTP transactions run on the same server, limiting the resources of the reporting workload ensures that OLTP's critical transactions are not throttled.

    Scenario 2: There are two DBAs in one organization. DBA A runs critical queries for the business, and DBA B does maintenance on the database. At no point should DBA A's work be affected, but at the same time DBA B should be allowed to work as well. The ideal situation is that when DBA B starts working he gets some resources, but he can't get more than the defined resources.

    Does SQL Server have any default Resource Governor components? Yes, SQL Server has two Resource Governor components created by default: 1) Internal - used exclusively by the database engine; users have no control over it. 2) Default - used by all workloads that are not assigned to any other group.

    What are the major components of Resource Governor? Resource pools, workload groups, and classification. In simple words, the Resource Governor process is: create a resource pool, create a workload group, create a classification function based on the criteria specified, and enable Resource Governor with that classification function.

    Is it possible to configure Resource Governor with T-SQL? Yes. Here is the code for it, with explanations in between.

    Step 0: Here we assume that there are separate login accounts for the reporting workload and the OLTP workload.

        /*-----------------------------------------------
        Step 0: (Optional and for Demo Purpose)
        Create Two User Logins
        1) ReportUser, 2) PrimaryUser
        Use ReportUser login for Reports workload
        Use PrimaryUser login for OLTP workload
        -----------------------------------------------*/

    Step 1: Creating the resource pools. We create two resource pools: 1) Report Server and 2) Primary OLTP Server. We give only a few resources to the Report Server pool because, as described in Scenario 1, it is the OLTP server that is mission critical, not the report server.

        -- Creating Resource Pool for Report Server
        CREATE RESOURCE POOL ReportServerPool
        WITH (
            MIN_CPU_PERCENT=0,
            MAX_CPU_PERCENT=30,
            MIN_MEMORY_PERCENT=0,
            MAX_MEMORY_PERCENT=30)
        GO
        -- Creating Resource Pool for OLTP Primary Server
        CREATE RESOURCE POOL PrimaryServerPool
        WITH (
            MIN_CPU_PERCENT=50,
            MAX_CPU_PERCENT=100,
            MIN_MEMORY_PERCENT=50,
            MAX_MEMORY_PERCENT=100)
        GO

    Step 2: Creating the workload groups. We create two workload groups, each mapping to one of the resource pools we have just created.

        -- Creating Workload Group for Report Server
        CREATE WORKLOAD GROUP ReportServerGroup
        USING ReportServerPool;
        GO
        -- Creating Workload Group for OLTP Primary Server
        CREATE WORKLOAD GROUP PrimaryServerGroup
        USING PrimaryServerPool;
        GO

    Step 3: Creating the user-defined function which routes the workload to the appropriate workload group. In this example we check SUSER_NAME() to make the workload group selection; we could use other functions such as HOST_NAME(), APP_NAME(), IS_MEMBER(), etc.

        CREATE FUNCTION dbo.UDFClassifier()
        RETURNS SYSNAME
        WITH SCHEMABINDING
        AS
        BEGIN
            DECLARE @WorkloadGroup AS SYSNAME
            IF (SUSER_NAME() = 'ReportUser')
                SET @WorkloadGroup = 'ReportServerGroup'
            ELSE IF (SUSER_NAME() = 'PrimaryUser')
                SET @WorkloadGroup = 'PrimaryServerGroup'
            ELSE
                SET @WorkloadGroup = 'default'
            RETURN @WorkloadGroup
        END
        GO

    Step 4: In this final step we enable Resource Governor with the classifier function created in Step 3 (a quick verification sketch follows at the end of this excerpt).

        ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.UDFClassifier);
        GO
        ALTER RESOURCE GOVERNOR RECONFIGURE
        GO

    Step 5: If you are following this demo and want to clean up, run the following script. Running it will disable Resource Governor and delete all the objects created so far.

        ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = NULL)
        GO
        ALTER RESOURCE GOVERNOR DISABLE
        GO
        DROP FUNCTION dbo.UDFClassifier
        GO
        DROP WORKLOAD GROUP ReportServerGroup
        GO
        DROP WORKLOAD GROUP PrimaryServerGroup
        GO
        DROP RESOURCE POOL ReportServerPool
        GO
        DROP RESOURCE POOL PrimaryServerPool
        GO
        ALTER RESOURCE GOVERNOR RECONFIGURE
        GO

    I hope this introductory example sheds enough light on the subject of Resource Governor. In future posts we will take this same example and learn a few more details. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Resource Governor
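    As promised in Step 4 above, a minimal verification sketch (my addition, not part of the original post) using the standard Resource Governor DMVs to confirm the pools and groups are live:

        -- List configured resource pools and their CPU limits
        SELECT pool_id, name, min_cpu_percent, max_cpu_percent
        FROM sys.dm_resource_governor_resource_pools;

        -- List workload groups and the pools they map to
        SELECT group_id, name, pool_id
        FROM sys.dm_resource_governor_workload_groups;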

    Read the article

  • SQL SERVER – PHP on Windows and SQL Server Training Kit

    - by pinaldave
    The PHP on Windows and SQL Server Training Kit includes a comprehensive set of technical content, including demos and hands-on labs, to help you understand how to build PHP applications using Windows, IIS 7.5 and SQL Server 2008 R2. This release includes the following:

    PHP & SQL Server Demos:
    Integrating SQL Server Geo-Spatial with PHP
    SQL Server Reporting Services and PHP

    PHP & SQL Server Hands On Labs:
    Introduction to Using SQL Server with PHP
    Using SQL Server Full-Text Search and FILESTREAM Storage with PHP
    New: Getting Started with SQL Server Migration Assistant for MySQL

    Download the PHP on Windows and SQL Server Training Kit. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • PeopleSoft and Fusion Middleware White Paper

    - by david.bain
    We all know that PeopleTools is a very productive enterprise application platform. It provides business logic, UI, reporting, integration, etc. - virtually the entire stack. The question many PeopleSoft users have is: "If I have PeopleSoft, what can Fusion Middleware do for me?" An excellent question. A white paper has just been published that answers that question. It's available on the www.oracle.com/peoplesoft site under the 'White Paper' link. Select the link that says 'Read this White Paper to learn how your PeopleSoft Application can benefit from Oracle Fusion Middleware'. After you've read the paper and are interested in more details, be sure to visit the PeopleSoft - Fusion Middleware Best Practice Center here: http://www.oracle.com/technology/tech/fmw4apps/peoplesoft/index.html

    Read the article

  • WCF Error when using “Match Data” function in MDS Excel AddIn

    - by Davide Mauri
    If you're using MDS and DQS with the Excel integration, you may get an error when trying to use the "Match Data" feature, which uses DQS to help identify duplicate data in your data set. The error is quite obscure, and you have to enable WCF error reporting to get the error details; you'll then discover that they are related to missing permissions in the MDS and DQS_STAGING_DATA databases. To fix the problem you just have to grant the needed permissions, as the following script does:

        USE MDS
        GO
        GRANT SELECT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
        GRANT INSERT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
        GRANT DELETE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]
        GRANT UPDATE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb]

        USE [DQS_STAGING_DATA]
        GO
        ALTER AUTHORIZATION ON SCHEMA::[db_datareader] TO [VMSRV02\mdsweb]
        ALTER AUTHORIZATION ON SCHEMA::[db_datawriter] TO [VMSRV02\mdsweb]
        ALTER AUTHORIZATION ON SCHEMA::[db_ddladmin] TO [VMSRV02\mdsweb]
        GO

    Here "VMSRV02\mdsweb" is the user you configured for MDS service execution. If you don't remember it, you can check which account has been assigned to the IIS application pool that your MDS website is using:
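    If you prefer the command line over IIS Manager for that check, here is a sketch using IIS's appcmd (the application pool name "MDS Application Pool" is a hypothetical placeholder - use whatever pool your MDS site actually runs under):

        %windir%\system32\inetsrv\appcmd list apppool "MDS Application Pool" /text:processModel.userName

    An empty result suggests the pool runs under a built-in identity (check processModel.identityType in that case) rather than a custom account.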

    Read the article

  • How to name a bug?

    - by Pieter
    Bugs usually receive a descriptive name: "that X-Y synchronization issue", "that crash after actions A, B and D but not C", "yesterday's update problem". Even the JIRA issue tracker has a field "Summary" instead of "Name". In discussing "big" bugs, I actually use JIRA ids to prevent confusion. There are a few restrictions to take into account: When reporting a bug, only the consequence of the bug is known; the root cause might never even be found. Several reported bugs might turn out to be duplicates, or might be completely different consequences of the same bug. In large projects, bugs will come at you by the dozens every month. Now, how would you name a bug? Name them like hurricanes, perhaps?

    Read the article

  • Why do we keep using CSV?

    - by Stephen
    Why do we keep using CSV? I recently made a shift to working in the health domain, and despite the wonderful work on data transfer standards, all data transfer is in CSV, both for reporting to external organisations and for data migrations when implementing new systems. Unfortunately the use of CSV is the cause of the endless repetition of the same stupid errors, with the same waste of developer time (bad escaping, failing to handle null fields, etc.). I know we can do better, and anything between JSON and XML (depending on the instance) would be fine. (Most of the time this is data going from one MS SQL Server 2005 to another!) I feel as if each time I see this happening I am literally watching one developer waste another's time. So why do we keep shafting each other? When will we stop?

    Read the article

  • Google Webmaster Tools reports fake 404 errors

    - by Edgar Quintero
    I have a website where Google Webmaster Tools reports 15,000 links as 404 errors. However, all the links return a 200 when I visit them. The problem is that even though I can visit these pages and get a 200, those 15,000 pages won't index in Google. They aren't appearing in search results. These are constant errors that Google Webmaster Tools keeps reporting, and I'm not sure what the problem is. We've thought of a DNS issue, but it shouldn't be a DNS issue, because if it were, no page would be indexed (I have 10,000 perfectly indexed). Regarding URL parameters, my pages do not share a similarity in URL parameters that would make it obvious to me what could be causing the error.
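    One way to double-check the status code a crawler actually receives (the URL below is a placeholder) is to request the page headers from the command line:

        # Print only the HTTP status code
        curl -s -o /dev/null -w "%{http_code}\n" "http://www.example.com/some-page"

        # Or inspect the full response headers
        curl -I "http://www.example.com/some-page"

    Note that a page can return 200 to a browser yet still be treated as a soft 404 if its content looks like an error page, so checking the response body as a crawler sees it is also worthwhile.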

    Read the article

  • SaaS Customer Service Matters

    - by charles.knapp
    You probably know that Oracle CRM On Demand goes beyond contact and transaction tracking by providing valuable real-time insights. Do you know that Oracle CRM On Demand also delivers valuable service to our customers? Don't take my word for it. "Prior to Oracle CRM On Demand, we were too busy looking in the rear view mirror on our sales activities and needed a forward-looking tool to maximize sales and coaching opportunities," said Christian Doelle, Vice President Sales & Marketing, MonierLifetile. "After evaluating other organization's solutions, we found Oracle as the most proven with the real-time reporting and detailed reviews of sales opportunities that helped us to address our blind spots. Additionally, we have found throughout our implementation phase that Oracle's commitment to customer attention and service is incomparable." Learn more here about MonierLifetile's experience with Oracle CRM On Demand.

    Read the article

  • Puppet: Making Windows Awesome Since 2011

    - by Robz / Fervent Coder
    Originally posted on: http://geekswithblogs.net/robz/archive/2014/08/07/puppet-making-windows-awesome-since-2011.aspx

    Puppet was one of the first configuration management (CM) tools to support Windows, way back in 2011. It has the heaviest investment in Windows infrastructure, with 1/3 of the platform client development staff being Windows folks. It appears that Microsoft believed an end-state configuration tool like Puppet was the way forward, so much so that they cloned Puppet's DSL (domain-specific language) in many ways and are calling it PowerShell DSC. Puppet Labs is pushing the envelope on Windows. Here are several things to note:

    Puppet x64 Ruby support for Windows coming in v3.7.0.
    An awesome ACL module (with ordering, SIDs and very granular control of permissions, it is the best of any CM).
    A wealth of modules that work with Windows on the Forge (and more on GitHub).
    Documentation solely for Windows folks - https://docs.puppetlabs.com/windows.
    Some of the common learning points for Puppet on Windows users are noted in this recent blog post.
    Microsoft OpenTech supports Puppet.
    Azure has the ability to deploy a Puppet Master (http://puppetlabs.com/solutions/microsoft).
    At Microsoft //Build 2014, in the Day 2 keynote, Puppet Labs CEO Luke Kanies co-presented with Mark Russinovich (http://channel9.msdn.com/Events/Build/2014/KEY02 - fast forward to 19:30)!
    Puppet has a Visual Studio plugin!

    It can be overwhelming learning a new tool like Puppet at first, but Puppet Labs has some resources to help you on that path. Take a look at the Learning VM, which has a quest-based learning tool. For real-time questions, feel free to drop onto #puppet on freenode.net (yes, some folks still use IRC) with questions, and #puppet-dev with thoughts/feedback on the language itself. You can subscribe to the puppet-users / puppet-dev mailing lists. There is also ask.puppetlabs.com for questions, and Server Fault if you want to go to a Stack Exchange site. There are books written on learning Puppet. There are even Puppet User Groups (PUGs) and other community resources!

    Puppet does take some time to learn, but with anything you need to learn, you need to weigh the benefits against the ramp-up time. I learned NHibernate once; it had a very high ramp-up time back then but was the only game on the street. Puppet's ramp-up time is considerably less than that. The advantage is that you are learning a DSL, and it can apply to multiple platforms (Linux, Windows, OS X, etc.) with the same Puppet resource constructs (see the small sketch at the end of this excerpt). As you learn Puppet you may wonder why it has a DSL instead of just leveraging the language of Ruby (or maybe this is one of those things that keeps you up wondering at night). I like the DSL over a thin layer on top of Ruby. It allows the Puppet language to be portable and go more places. It makes you think about the end state of what you want to achieve in a declarative sense instead of in an imperative sense.

    You may also find that right now Puppet doesn't run manifests (scripts) in the order resources are specified. This is the number one learning point for most folks. Manifest ordering, long a point of consternation for some folks, was not possible in the past. In fact it might be why some other CMs exist! As of 3.3.0, Puppet can do manifest ordering, and it will be the default in Puppet 4. http://puppetlabs.com/blog/introducing-manifest-ordered-resources

    You may have caught earlier that I mentioned PowerShell DSC. But what about DSC? Shouldn't that be what Windows users want to choose? Other CMs are integrating with DSC; will Puppet follow suit and integrate with DSC? The biggest concern that I have with DSC is its lack of visibility in fine-grained reporting of changes (which Puppet has). The other is that it is a very young Microsoft product (pre version 3, you know what they say :) ). I tried getting it working in December and ran into some issues. I'm hoping that newer releases actually work; it does have some promising capabilities, it just doesn't quite come up to the standard of something that should be used in production. In contrast, Puppet is an almost ten-year-old language with an active community! It's very stable, and when trusting your business to configuration management, you want something that has been around a while and has been proven. Give DSC another couple of releases and you might see more folks integrating with it. That said, there may be a future with DSC integration. Portability and fine-grained reporting of configuration changes are reasons to take a closer look at Puppet on Windows. Yes, Puppet on Windows is here to stay, and it's continually getting better, folks.
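    For a flavor of the declarative style discussed above, a minimal manifest sketch (the paths and content are hypothetical placeholders, not from the post):

        # Declare the end state; Puppet works out how to get there
        file { 'C:/temp':
          ensure => directory,
        }

        file { 'C:/temp/hello.txt':
          ensure  => file,
          content => 'Hello from Puppet on Windows',
          require => File['C:/temp'],  # explicit dependency instead of relying on manifest order
        }

    The require metaparameter makes the ordering explicit, which is why manifest order historically did not matter to Puppet.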

    Read the article

  • Announcing Oracle Audit Vault and Database Firewall

    - by Troy Kitch
    Today, Oracle announced the new Oracle Audit Vault and Database Firewall product, which unifies database activity monitoring and audit data analysis in one solution. This new product expands protection beyond Oracle and third-party databases with support for auditing the operating system, directories and custom sources. Here are some of the key features of Oracle Audit Vault and Database Firewall:

    Single Administrator Console
    Default Reports
    Out-of-the-Box Compliance Reporting
    Report with Data from Multiple Source Types
    Audit Stored Procedure Calls - Not Visible on the Network
    Extensive Audit Details
    Blocking SQL Injection Attacks
    Powerful Alerting Filter Conditions

    To learn more about the new features in Oracle Audit Vault and Database Firewall, watch the on-demand webcast.

    Read the article

  • links for 2010-05-26

    - by Bob Rhubart
    @vambenempe - Dear Cloud API, your fault line is showing "I am talking about the dreadful state of fault reporting in remote APIs, from Twitter to Cloud interfaces. They are badly described in the interface documentation and the implementations often don’t even conform to what little is documented." -- William Vambenempe (tags: oracle otn cloud) @oraclebase: Consuming Web Services using PL/SQL Oracle ACE Director Tim Hall shares a couple of solutions for consuming web services using PL/SQL. (tags: oracle otn oracleace soa sql webservices) Douwe Pieter van den Bos: IT Project misstep: To Serve and Protect "Thoughts and vision change during time. We gain new insights and other people share their knowledge. This is exactly why software development projects need to be based on a change facilitating manner, not trying to avoid change, or make it more difficult." -- Douwe Pieter van den Bos (tags: oracle otn architect projectmanagement innovation)

    Read the article

  • My First Post @ geekswithblogs

    - by sathya
    Dear Friends, here is my first post on Geekswithblogs. I am happy that I have got a separate space here to blog. I am an MCTS-certified professional in .NET 2.0 web applications, working as a Senior Software Engineer, and willing to share my knowledge on whatever topics I know. I am also an active presenter/speaker in the Microsoft developer user group HyderabadTechies, and I have presented many online sessions there. I keep myself updated on the latest Microsoft technologies. You can see my posts here on the following subjects: C#, ASP.NET, SQL Server, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS). I have a personal blog too where I share my knowledge; please take note of it: http://cybersathya.blogspot.com. You will see me here often, posting updates on technologies and the technical challenges I faced and the solutions for the same. Stay tuned!!! Regards, Sathya Narayanan Srinivasan

    Read the article

  • Well supported Hardware Raid Controller

    - by ftiaronsem
    Hello all. I am currently planning to buy a hardware RAID controller. This became necessary since I am running Linux and Windows in parallel and now need redundancy for both OSes (I am going to use RAID 1 / mirroring). Therefore I am searching for a hardware RAID controller that is well supported by Linux / Ubuntu (reporting SMART values, stats for the hard drives, etc.). This controller should have four SATA ports and, if possible, it should fit in a PCIe x1 slot. I would greatly appreciate it if you could suggest some devices. Thanks in advance.

    Read the article

  • SAS Unifies the View of Risk for Banks with SAS Risk Management for Banking

    SAS unifies the view of risk for banks with SAS Risk Management for Banking. SAS is launching SAS Risk Management for Banking, an integrated solution dedicated to risk management in the banking sector. The solution leverages the capabilities of the SAS Business Analytics platform for data integration, analytics and reporting. "Its risk management capabilities meet the requirements of regulatory standards and the performance needs of the various business units," the vendor notes. SAS Risk Management for Banking covers the complete process from data management to reporting...

    Read the article

  • SQL Server Data Tools–BI for Visual Studio 2013 Re-released

    - by Greg Low
    Customers used to complain that the tooling for creating BI projects (Analysis Services MD and Tabular, Reporting Services, and Integration Services) was based on earlier versions of Visual Studio than the ones they were using for their other work in Visual Studio (such as C#, VB, and ASP.NET projects). To alleviate that problem, the shipment of those tools has been decoupled from the shipment of the SQL Server product. In SQL Server 2014, the BI tooling isn't even included in the released version of SQL Server. This allows the team to keep up to date with the releases of Visual Studio. A little while back, I was really pleased to see that the Visual Studio 2013 update for SSDT-BI (SQL Server Data Tools for Business Intelligence) had been released. Unfortunately, it then had to be withdrawn. The good news is that it's back, and you can get the latest version from here: http://www.microsoft.com/en-us/download/details.aspx?id=42313

    Read the article

  • OTP Bank Uses Oracle Warehouse Builder

    - by Fekete Zoltán
    The following document has just been published among the customer success stories on Oracle.com: "OTP Bank Data Warehouse Development Team Improves Service Level and Lowers Reporting Lead Time for Business Fields by 80%". In other words, OTP Bank uses the Oracle Warehouse Builder ETL/ELT tool for its data warehouse development. OTP Bank's transactional data warehouse development team raised the quality of the services it provides to internal customers; one result is an 80% reduction in the lead time of reporting processes across business lines. The Hungarian-language success story can be downloaded here. The most important results around OWB:

    improved data quality, achieved through the standardization of ETL processes (OWB)
    Oracle Business Intelligence EE: more effective collaboration between the business areas and IT development
    standardized ETL and reporting processes:
    - conscious business metadata management driven by fixed report sets
    - unified terminology
    - precise knowledge of complex banking processes, for both business areas and IT developers
    - effective collaboration within the bank
    - reduced turnaround time for the process from request to data publication
    - preparation of ad-hoc reports dropped by 80%, from the previous 1.5 weeks to an average of 2 working days

    Read the article

  • June 25 changes to BIS 742.15 How does it impact SSL iPhone App export compliance

    - by Rob
    This question isn't strictly development-related but I hope it's still acceptable :) On June 25, 2010 the BIS updated 742.15. Of interest to me is the new 742.15(b)(4), "Exclusions from mass market classification request, encryption registration and self-classification reporting requirements", and 742.15(b)(4)(ii), which states:

    (ii) Foreign products developed with or incorporating U.S.-origin encryption source code, components, or toolkits. Foreign products developed with or incorporating U.S.-origin encryption source code, components or toolkits that are subject to the EAR, provided that the U.S.-origin encryption items have previously been classified or registered and authorized by BIS and the cryptographic functionality has not been changed. Such products include foreign developed products that are designed to operate with U.S. products through a cryptographic interface.

    I take this to mean that my Canadian-produced product that uses https is now excluded from requiring a CCATS. What does everyone else think?

    Read the article

  • Oracle European Launch Event: Oracle Database In-Memory

    - by A&C Redaktion
    On June 17, Andy Mendelsohn, Oracle Executive Vice President Database Server Technologies, will open the European launch of Oracle Database In-Memory as keynote speaker at the Radisson Blue Hotel in Frankfurt. This topic is of central importance to our partners and customers, so the agenda of this launch event is also unique: alongside presentations by beta customers (Postbank and CERN), by the IDC analyst who will also be on stage with Larry Ellison at the HQ launch in Redwood Shores a week earlier, and by further Oracle experts, there will also be live demos. A panel discussion rounds out the program, and press and analyst briefings will be held in parallel with the event. With the new, groundbreaking Oracle Database In-Memory option, customers benefit from significantly accelerated database performance for analytics, data warehousing, reporting and online transaction processing (OLTP). This is so revolutionary that we warmly invite all our partners and their end customers to this outstanding event. You can register yourself and your end customers for this exclusive live event here.

    Read the article
