Search Results

Search found 20857 results on 835 pages for 'technical support'.

Page 605 of 835

  • Data Mining Resources

    - by Dejan Sarka
    There are many different types of analyses, each with its own pros and cons. Relational reports have a predefined structure that end users cannot change, which makes them simple to use. Reports can use real-time data or snapshots to show the state of the data at specific points in time. The drawbacks: report authoring is limited to IT pros and advanced users, dynamic restructuring is very limited, and a report built on real-time data puts load on the source system. Report processing can also be slow, because the data comes from relational database management systems, which are not optimized for reporting. If you create a semantic model of your data, end users can create ad-hoc report structures; however, development is more complex, because a developer is needed to build the semantic model.

    For OLAP, you typically use specialized database management systems and get lightning-fast analyses. End users can use rich and thin clients to change the structure of a report interactively, typically through a graphical interface. However, developing an OLAP system is often quite complex: it involves preparing and maintaining an enterprise data warehouse and OLAP cubes. To exploit the real-time restructuring of reports, users must be both active and educated. The data is usually stale, as it is loaded into data warehouses and OLAP cubes by a scheduled process.

    With data mining, a structure is not selected in advance; the algorithm searches for the structure. As a result, data mining can give you the most valuable results, because you can discover patterns you did not expect. A data mining model's structure is limited only by the attributes you use to train the model. The drawbacks: a lot of knowledge is needed for a successful data mining project, end users have to understand the results, subject matter experts and IT professionals need to understand the business problem thoroughly, and development can sometimes be even more complex than developing OLAP cubes.

    Each type of analysis has its place in an enterprise system. SQL Server has tools for all of them. However, data mining is the most advanced way of analyzing the data; this is the "I" in BI. To get the most out of it, you need to learn quite a lot. In this blog post, I am gathering resources for learning, including forthcoming events.

    Books
    Multiple authors: SQL Server MVP Deep Dives – I wrote an introductory data mining chapter there.
    Erik Veerman, Teo Lachev, and Dejan Sarka: MCTS Self-Paced Training Kit (Exam 70-448): Microsoft SQL Server 2008 - Business Intelligence Development and Maintenance – a good overview of a complete BI solution, including data mining.
    Jamie MacLennan, ZhaoHui Tang, and Bogdan Crivat: Data Mining with Microsoft SQL Server 2008 – can't miss this book if you want to mine your data with SQL Server tools.
    Michael Berry, Gordon Linoff: Mastering Data Mining: The Art and Science of Customer Relationship Management – data mining from both business and technical perspectives.
    Dorian Pyle: Data Preparation for Data Mining – an in-depth book about data preparation.
    Thomas and Ronald Wonnacott: Introductory Statistics – if you thought you could get away without statistics, you are not serious about data mining.
    Jiawei Han and Micheline Kamber: Data Mining Concepts and Techniques – in-depth explanations of the most popular data mining algorithms.
    Michael Berry and Gordon Linoff: Data Mining Techniques – another book that explains data mining algorithms, more from a business perspective.
    Paolo Giudici: Applied Data Mining – a very mathematical book, only if you enjoy statistics and mathematics in general.

    Forthcoming presentations
    I am presenting two data mining sessions during the PASS Summit in Charlotte, NC:
    Wednesday, October 16th, 2013 - Fraud Detection: Notes from the Field – I am showing how to use data mining for a specific business problem. The presentation is based on real-life projects.
    Friday, October 18th - Excel 2013 Advanced Analytics – I am focusing on the Excel Data Mining Add-ins and how to use them together with Power Pivot and other add-ins. This is the most you can get out of Excel.
    Sinergija 2013, Belgrade, Serbia - Tuesday, October 22nd: Excel 2013 Analytics to the Max – another presentation focusing on the most advanced analytics you can get in Excel.
    SQL Rally Amsterdam, Netherlands - Thursday, November 7th: Advanced Analytics in Excel 2013 – and again I am presenting about data mining in Excel. Why three different titles for the same presentation? I don't know; I guess I forgot the name I proposed every time, right after I sent the proposal.

    Courses
    Data Mining with SQL Server 2012 – I wrote a three-day course for SolidQ. If you are interested in this course, which I could also deliver as a shorter seminar, you can contact your closest SolidQ subsidiary, or, of course, me directly at [email protected] or [email protected]. This course could also complement the existing courseware portfolio of training providers, who are welcome to contact me as well.

    OK, now you know: no more excuses. Start learning data mining and get the most out of your data.

    Read the article

  • Variables versus constants versus associative arrays in PHP

    - by susmits
    I'm working on a small project, and need to implement internationalization support somehow. I am thinking along the lines of using constants to define a lot of symbols for text in one file, which could be included subsequently. However, I'm not sure if using variables is faster, or if I can get away with using associative arrays without too much of a performance hit. What's better for defining constant values in PHP, performance-wise -- constants defined using define("FOO", "..."), or simple variables like $foo = "...", or associative arrays like $symbols["FOO"]?
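
    For reference, a minimal sketch of the three candidates side by side (the symbol name and strings are placeholders):

        <?php
        // 1) Constant: defined once, globally visible, cannot be reassigned.
        define('GREETING', 'Hello');

        // 2) Plain variable: cheap to read, but subject to scoping rules.
        $greeting = 'Hello';

        // 3) Associative array: one symbol table, easy to swap per locale.
        $symbols = array('GREETING' => 'Hello');

        echo GREETING, "\n";              // constant lookup
        echo $greeting, "\n";             // variable lookup
        echo $symbols['GREETING'], "\n";  // array lookup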

    Read the article

  • Crystal report using Linq-to-sql

    - by DATT OZA
    Hello, I am a bit confused about generating a Crystal Report from a LINQ-to-SQL object in my WPF application. I have done the following:

        CrystalReport1 rpt = new CrystalReport1();
        DataContextClass1 db = new DataContextClass1();
        var q = (from records in db.emp select records).ToList();
        rpt.SetDataSource(q);
        crystalViewer.ReportSource = rpt;

    but it prompts this error: "NotSupportedException was unhandled: DataSet does not support System.Nullable<". Please help... thanks in advance.
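
    For context, this NotSupportedException is typically raised because the report engine cannot bind Nullable<T> columns; one common shape of the fix is to project the nullable columns to plain values before binding. A hedged sketch (Name and Salary are hypothetical columns on emp):

        var q = (from e in db.emp
                 select new
                 {
                     e.Name,                  // non-nullable columns pass through
                     Salary = e.Salary ?? 0m  // collapse Nullable<decimal> to decimal
                 }).ToList();
        rpt.SetDataSource(q);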

    Read the article

  • WCF performance improvements

    - by Burt
    I am developing a WPF application that talks to a server via WCF services over the internet. After profiling the application, I noticed a lot of time is being taken up by creating the WCF client proxy and making the call to the server. The code on the server is optimised and doesn't take any time to run, yet I am still seeing a 1.5 second delay from when a service is invoked to when it returns to the client. A few points to give a bit of background: I am using ASP.NET membership for security; I will eventually hook into the same server-side code through a website; I would eventually like to have offline support in the application. I really need to nail the performance early, though, because if the app takes a couple of seconds to come back, it is too long for what I am trying to do. Can anyone suggest performance tips that will help me, please?
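
    One frequent cause of a fixed per-call delay like this is building a new proxy (and channel factory) for every request, since factory construction is the expensive step. A sketch of caching the factory, assuming a contract IMyService and an endpoint configuration named "MyServiceEndpoint" (both hypothetical):

        // requires a reference to System.ServiceModel
        private static readonly ChannelFactory<IMyService> factory =
            new ChannelFactory<IMyService>("MyServiceEndpoint");

        public string CallService()
        {
            IMyService channel = factory.CreateChannel(); // cheap once the factory exists
            try
            {
                return channel.DoWork(); // DoWork is a placeholder operation
            }
            finally
            {
                ((IClientChannel)channel).Close();
            }
        }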

    Read the article

  • Aaron Hillegass Chapter 18 Challenge Question

    - by jasonbogd
    I am working through Aaron Hillegass' Cocoa Programming for Mac OS X and am doing the challenge for Chapter 18. Basically, the challenge is to write an app that can draw ovals using your mouse, and then additionally, add saving/loading and undo support. I'm trying to think of a good class design for this app that follows MVC. Here's what I had in mind: Have a NSView-subclass that represents an oval (say JBOval) that I can use to easily draw an oval. Have a main view (JBDrawingView) that holds JBOvals and draws them. The thing is that I wasn't sure how to add archiving. Should I archive each JBOval? I think this would work, but archiving an NSView doesn't seem very efficient. Any ideas on a better class design? Thanks.

    Read the article

  • Product table, many kinds of product, each product has many parameters

    - by StoneHeart
    Hi, I don't have much experience in table design. My goal is a product table (or tables) designed to meet the requirements below: Support many kinds of products (TV, Phone, PC, ...). Each kind of product has a different set of parameters; for example, a Phone has Color, Size, Weight, OS..., while a PC has CPU, HDD, RAM... The set of parameters must be dynamic: you can add or edit any parameter you like. I don't want to make a table for each kind of product, so I need help finding a correct solution. Thanks.
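
    One common way to meet all three requirements is an entity-attribute-value layout: one row per parameter value instead of one column per parameter. A sketch with illustrative names:

        CREATE TABLE Kind         (KindId INT PRIMARY KEY, Name VARCHAR(50));      -- TV, Phone, PC, ...
        CREATE TABLE Product      (ProductId INT PRIMARY KEY, Name VARCHAR(100),
                                   KindId INT REFERENCES Kind(KindId));
        CREATE TABLE Parameter    (ParameterId INT PRIMARY KEY, Name VARCHAR(50),  -- Color, CPU, RAM, ...
                                   KindId INT REFERENCES Kind(KindId));
        CREATE TABLE ProductValue (ProductId INT REFERENCES Product(ProductId),
                                   ParameterId INT REFERENCES Parameter(ParameterId),
                                   Value VARCHAR(255),
                                   PRIMARY KEY (ProductId, ParameterId));

    New parameters then become rows in Parameter rather than schema changes, at the cost of weaker typing on the Value column.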

    Read the article

  • Check for Block Ads/Scripts (Browser Addons, Compatibility)

    - by acidzombie24
    I'm conflicted; you guys decide whether this should migrate to SU. I would like to test my site against popular browser add-ons. At the moment I have tested against NoScript and Adblock Plus for Firefox. What other popular add-ons should I check compatibility with? By compatibility I mean working as intended, ads included, on the browsers I support (Opera, Firefox, Chrome, IE 7/8). NoScript broke my site, and for Adblock Plus I ask once per week to consider allowing ads. When I see IE6 I notify the user that the site is known to be unusable with that browser (the site is script-heavy by nature, and I wouldn't want to accidentally serve ads that infect IE6 users with a virus).

    Read the article

  • Implement a HierarchicalDataBoundControl for ASP.NET

    - by Breadtruck
    I want to implement a hierarchical data-bound control for ASP.NET. I used Implementing IHierarchy Support Into Your Custom Collection as a basis to create my hierarchical collection, and I also created a HierarchicalDataSourceControl for the collection. Everything works: I can bind it to an ASP.NET TreeView and it renders the data correctly. However, I'm stuck on the HierarchicalDataBoundControl. I found examples, but I am still too new at C#/.NET to understand them. I don't quite understand how to implement the examples Rendering a databound UL menu or the HierarchicalDataBoundControl Class. Does anyone have any tips, or better examples of implementing a control like this?
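
    To make the pattern concrete, a stripped-down sketch of the override at the heart of those examples: ask the data source for its root view and walk the IHierarchyData nodes (UlMenu and the rendering step are placeholders):

        // using System.Web.UI; using System.Web.UI.WebControls;
        public class UlMenu : HierarchicalDataBoundControl
        {
            protected override void PerformDataBinding()
            {
                base.PerformDataBinding();
                IHierarchicalEnumerable roots = GetData("").Select(); // root-level view
                foreach (object item in roots)
                {
                    IHierarchyData node = roots.GetHierarchyData(item);
                    // render this node here, then recurse via node.GetChildren()
                }
            }
        }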

    Read the article

  • SQL Server Reporting Services - Fast TimeDataRetrieval - Long TimeProcessing

    - by user197529
    An application that I support has recently begun taking much longer to execute reports in SQL Server Reporting Services. The reports being executed are not terribly complex. There are multiple stored procedures (between 5 and 8), which return anywhere from a handful to 8000 records in total. Reports are generally 2 to 100 pages. One can argue (and I have) the benefit of a 100-page report, but the client is footing the bill. At any rate, the problem is that even a report returning 500 records (11 pages) takes 5 minutes to come back to the browser. In the execution log the TimeDataRetrieval is 60 seconds, but the TimeProcessing is 235 seconds. It seems bizarre to me that my queries run so quickly, yet it takes Reporting Services so long to process the data. Any suggestions are greatly appreciated. Kind regards, Bernie

    Read the article

  • Java library for HTML analysis

    - by Raj
    Hi, (I've seen similar questions, but I think none of them cater to my specific needs, hence...) I would like to know if there is a Java library for analysis of real-world (read: incomplete, ill-formed) HTML. By analysis, I mean things like: figuring out the most prominent color in an HTML chunk; changing that color to some other color (so it has to support modification of the HTML as well); pruning out unwanted tags; and fixing up the HTML to produce a well-formed snippet. Parts of the last two are done by libraries such as Jericho and jTidy. 'Plugins' on top of these would be great. Thanks in advance!
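
    For the fixing-up step specifically, a minimal jTidy pass looks roughly like this (file names are placeholders); the color analysis and tag pruning would then operate on the repaired document:

        import org.w3c.tidy.Tidy;
        import java.io.*;

        public class HtmlCleaner {
            public static void main(String[] args) throws IOException {
                Tidy tidy = new Tidy();
                tidy.setXHTML(true);  // emit well-formed XHTML
                tidy.setQuiet(true);  // suppress warning chatter
                try (InputStream in = new FileInputStream("page.html");
                     OutputStream out = new FileOutputStream("clean.html")) {
                    tidy.parseDOM(in, out); // repaired document is also returned as a DOM
                }
            }
        }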

    Read the article

  • High volume SVM (machine learning) system

    - by flyingcrab
    I'm working on a possible machine learning project that would be expected to do high-speed computations for machine learning using SVM (support vector machines) and possibly some ANN. I'm reasonably comfortable working with these in MATLAB, but primarily on small datasets, just for experimentation. I'm wondering if this MATLAB-based approach will scale, or should I be looking into something else? C++/GPU-based computing? Java wrapping of the MATLAB code and pushing it onto App Engine? Incidentally, there seems to be a lot of literature on GPUs, but not much on how useful they are for machine learning applications using MATLAB. And what is the cheapest CUDA-enabled GPU money can buy? Is it even worth the trouble?

    Read the article

  • Which web containers install themselves well as a Windows service?

    - by Thorbjørn Ravn Andersen
    We have had a web application product for several years and used Tomcat to deploy it under Windows, as Tomcat registers itself as a Windows service so it starts and stops automatically. We may now need more JEE facilities than Tomcat provides (we are very tempted by the JEE 6 features in the container), so the question is which open-source JEE containers work well as Windows services. Since GlassFish is the only JEE 6 implementation right now, it would be nice if it works well, but I'd like to hear experiences and not just what I can read in brochures. If not, what else do people use? EDIT: This goes for web containers too, not just JEE containers. We will probably keep the necessary stack bundled until we find the right container and it gets JEE 6 support. EDIT: I want this to work as distributed. I'm not interested in manually hacking wrappers etc.; I want the installation process to handle the creation and removal of the service.

    Read the article

  • Logging strategy vs. performance

    - by vtortola
    Hi, I'm developing a web application that has to support lots of simultaneous requests, and I'd like to keep it fast enough. I now have to implement a logging strategy; I'm going to use log4net, but... what and how should I log? I mean: How does logging impact performance? Is it possible/recommended to log using async calls? Is it better to use a text file or a database? Can it be made conditional? For example, log to the database by default, and if that fails, switch to a text file. What about multithreading? Should I care about synchronization when I use log4net, or is it thread-safe out of the box? The requirements also say the application should cache a couple of things per request, and I'm afraid of the performance impact of that. Cheers.
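
    On the thread-safety point, log4net is documented as thread-safe, so no extra synchronization is needed around logging calls. On the performance point, one common option is to wrap the real appender in a BufferingForwardingAppender so most logging calls only touch an in-memory buffer; a sketch (appender names are illustrative):

        <appender name="Buffer" type="log4net.Appender.BufferingForwardingAppender">
          <bufferSize value="256" />
          <appender-ref ref="FileAppender" />
        </appender>
        <root>
          <level value="INFO" />
          <appender-ref ref="Buffer" />
        </root>

    The trade-off: buffered events that have not yet been flushed can be lost if the process crashes, so this buys speed at the cost of durability.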

    Read the article

  • Is there any real benefit to using ASP.Net Authentication with ASP.Net MVC?

    - by alchemical
    I've been researching this intensely for the past few days. We're developing an ASP.NET MVC site that needs to support 100,000+ users, and we'd like to keep it fast, scalable, and simple. We have our own SQL database tables for user, user_role, etc., and we are not using server controls. Given that there are no server controls, and a custom MembershipProvider would need to be created, is there any benefit left to using ASP.NET Auth/Membership? The alternative would seem to be custom code that drops a unique CustomerID in a cookie and authenticates with that; if we're paranoid about sniffers, we could encrypt the cookie as well. In this scenario (MVC, with customer data in our own tables), is there any real benefit to the ASP.NET auth/membership framework, or is a fully custom solution a viable route?
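
    One middle ground worth noting before going fully custom: the Forms Authentication ticket (the encrypted, signed cookie) works without the Membership provider, so you can validate against your own user tables and still let ASP.NET handle the cookie cryptography. A sketch, where ValidateCredentials is a hypothetical check against your own tables:

        // using System.Web.Security;
        if (ValidateCredentials(username, password)) // your own SQL lookup
        {
            // issues the encrypted, tamper-resistant forms-auth cookie
            FormsAuthentication.SetAuthCookie(username, false);
        }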

    Read the article

  • How to restore your production database without needing additional storage

    - by David Atkinson
    Production databases can get very large. This in itself is to be expected, but when a copy of the database is needed, the database must be restored, requiring additional and costly storage. For example, if you want to give each developer a full copy of your production server, you’ll need n times the storage cost for your n-developer team. The same is true for any test databases created during the course of your project lifecycle.

    If you’ve read my previous blog posts, you’ll be aware that I’ve been focusing on the database continuous integration theme. In my CI setup I create a “production”-equivalent database directly from its source control representation, and use this to test my upgrade scripts. Despite this being a perfectly valid and practical thing to do as part of a CI setup, it’s not the exact equivalent of running the upgrade script on a copy of the actual production database. So why shouldn’t I instead simply restore the most recent production backup as part of my CI process? There are two reasons why this would be impractical.

    1. My CI environment isn’t an exact copy of my production environment. This would be the case in a perfect world, and it is strongly recommended as good practice if you follow Jez Humble and David Farley’s “Continuous Delivery” teachings, but in practical terms it might not always be possible, especially where storage is concerned. It may just not be possible to restore a huge production database on the environment you’ve been allotted.

    2. It’s not just the storage requirements; it’s also the time it takes to do the restore. The whole point of continuous integration is that you are alerted as early as possible whether the build (yes, the database upgrade script counts!) is broken. If I have to run an hour-long restore each time I commit a change to source control, I’m just not going to get the feedback quickly enough to react.

    So what’s the solution? Red Gate has a technology, SQL Virtual Restore, that is able to restore a database without using up additional storage. Although this sounds too good to be true, the explanation is quite simple (although I’m sure the technical implementation details under the hood are quite complex!). Instead of restoring the backup in the conventional sense, SQL Virtual Restore effectively mounts the backup using its HyperBac technology. It creates a data file and a log file, .vmdf and .vldf, that become the delta between the .bak file and the virtual database. This means that both read and write operations are permitted on the virtual database: from SQL Server’s point of view it is no different from a conventional database. Instead of doubling the storage requirements upon a restore, there are no ‘duplicate’ storage requirements, other than the trivially small virtual log and data files. The benefit is magnified the more databases you mount against the same backup file. This technique could be used to provide a large development team with a full development instance of a large production database.

    It is also incredibly easy to set up. Once SQL Virtual Restore is installed, you simply run a conventional RESTORE command to create the virtual database. This is what I have running as part of a nightly “release test” process triggered by my CI tool:
    RESTORE DATABASE WidgetProduction_Virtual
    FROM DISK = N'D:\VirtualDatabase\WidgetProduction.bak'
    WITH MOVE N'WidgetProduction'
             TO N'C:\WidgetWF\ProdBackup\WidgetProduction_WidgetProduction_Virtual.vmdf',
         MOVE N'WidgetProduction_log'
             TO N'C:\WidgetWF\ProdBackup\WidgetProduction_log_WidgetProduction_Virtual.vldf',
         NORECOVERY, STATS = 1, REPLACE
    GO

    RESTORE DATABASE WidgetProduction_Virtual WITH RECOVERY

    Note that the only change from what you would do normally is the naming of the .vmdf and .vldf files. SQL Virtual Restore intercepts the command by monitoring the extension and applies its magic, ensuring the ‘virtual’ restore happens rather than the conventional, storage-heavy one. My automated release test then applies the upgrade scripts to the virtual production database and runs some validation tests, giving me confidence that were I to run this on production for real, all would go smoothly.

    For illustration: my 8 GB production database, its corresponding backup file, and the .vldf and .vmdf files, which represent the only additional storage used by the new database following the virtual restore.

    The beauty of this product is its simplicity. Once it is installed, the interaction with the backup and virtual database is exactly the same as before, as the clever stuff is being done at a lower level. SQL Virtual Restore can be downloaded as a fully functional 14-day trial.

    Read the article


  • installshield: Windir returns c:\documents & settings\fcuser\windows instead of c:\windows

    - by Sakhawat Ali
    We have a setup developed in InstallShield version 6.3. It is a self-extracting single setup. It works fine on most Windows versions, but on Windows Server 2003 64-bit, in execution mode, reading the Windows directory via WINDIR returns the user's Windows directory, i.e. c:\documents & settings\fcuser\windows, instead of C:\Windows. According to http://support.microsoft.com/?kbid=186499 it should work when I change the compatibility bit of the setup, but it didn't. I tried changing the compatibility bit of the INSTRUN, SETUP, and SETUP1 keys too, but that didn't work either. However, when I extract the self-extracting package and run the setup directly, it works fine.

    Read the article

  • Spring 3 JSON with MVC

    - by stevedbrown
    Is there a way to build Spring web calls that consume and produce application/json formatted requests and responses, respectively? Maybe this isn't Spring MVC, I'm not sure. I'm looking for Spring libraries that behave in a similar fashion to Jersey/JSON. The best case would be an annotation that I could add to the controller classes to turn them into JSON service calls. A tutorial showing how to build Spring web services with JSON would be great. EDIT: I'm looking for an annotation-based approach (similar to Jersey). EDIT2: Like Jersey, I am looking for REST support (POST, GET, DELETE, PUT). EDIT3: Most preferably, this will be the pom.xml entries and some information on using Jackson with the Spring-native way of doing things.
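
    For what it's worth, a sketch of the annotation-based approach in Spring 3: with <mvc:annotation-driven/> configured and the Jackson JARs on the classpath, @RequestBody and @ResponseBody parameters are read from and written to application/json automatically (Account, lookup, and save are placeholders):

        import org.springframework.stereotype.Controller;
        import org.springframework.web.bind.annotation.*;

        @Controller
        public class AccountController {

            @RequestMapping(value = "/accounts/{id}", method = RequestMethod.GET)
            @ResponseBody
            public Account get(@PathVariable String id) {
                return lookup(id); // return value serialized to JSON by Jackson
            }

            @RequestMapping(value = "/accounts", method = RequestMethod.POST)
            @ResponseBody
            public Account create(@RequestBody Account account) {
                return save(account); // request body deserialized from JSON
            }
        }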

    Read the article

  • Not so long ago in a city not so far away by Carlos Martin

    - by Maria Sandu
    This is the story of how the EMEA Presales Center turned an Oracle intern into a trusted technology advisor for both Oracle's Sales teams and customers.

    It was the summer of 2011. I was finishing my Computer Engineering studies as well as my internship at Oracle when I was offered what could possibly be THE dream job for any young European Computer Engineer. The role also seemed particularly tailored to me, as I could leverage almost everything I had learned at university and during the internship. And all of it in one of the best cities to live in, not only in my home country but arguably in Europe: Malaga!

    A day at EPC
    As part of the EPC Technology pillar, and later on completely focused on WebCenter, there was no way to describe a normal day on the job, as each day had something unique. Some days I was researching documentation in order to elaborate accurate answers to a customer's question within a Request for Information or Proposal (RFI/RFP); other days I was doing heavy programming in order to bring a Proof of Concept (PoC) for a customer to life; and, last but not least, some days I presented to the customer, via web conference, the demo I had built for them over the past weeks. So as you can see, the role has research, development, and presentation. Could you ask for more? Well, don't worry, because there IS more!

    Internationality
    As the organization's name suggests, the EMEA Presales Center is the center of presales within Europe, the Middle East, and Africa, so I got the chance to work with great professionals from all these regions, expanding my network and learning things from one country to apply them in others. In addition, the teams based in the Malaga office are made up of many young professionals hailing mainly from Western and Central European countries (although there are a couple of exceptions!) with very different backgrounds and personalities, which guaranteed many laughs and stories during lunch or coffee breaks (or even while working on projects!). Furthermore, with EPC offices in Bucharest and Bangalore and today's telepresence technologies, I was working every day with people from India or Romania as if they were sitting right next to me, and the bond with them got stronger day by day.

    Career development
    Apart from the research and self-study I mentioned earlier, one of the EPC's Key Performance Indicators (KPIs) is that 15% of your time is spent on training, so you get lots and lots of training to develop both your technical product knowledge and your presentation, negotiation, and other soft skills. Sometimes the training is via webcast, sometimes the trainer comes to the office, and sometimes, the best times, you get to travel abroad to attend a training, which also helps you further develop your network by meeting face to face the many people you only knew from email or instant messaging. And as the months go by, with your skills improving at a very fast pace and your relevance increasing with each new project you successfully deliver, it's only a matter of time (and a bit of self-promotion!) before you get the attention of the manager of a more senior team and are offered the opportunity to take a new step in your professional career. For me it took two years to move to my current position, Technology Sales Consultant in the Oracle Direct organization.

    During those two years I had built a good relationship with the Oracle Direct Spanish sales reps and sales managers, who are also based in the Malaga office. I supported their former Sales Consultant in a couple of presentations and demos, and they were very happy with my overall performance and attitude, so even before the position eventually became vacant, I got a heads-up that their current Sales Consultant was going to move to a different position. To me it felt like a natural step: same as when I joined EPC, at least 50% of the "homework" was already done, but I wanted to experience the other 50% and add new product and soft skills to my arsenal. The rest is history. I've been in the role for more than half a year as I write this, have already achieved some important wins, gained a lot of trust and confidence in front of customers, and broadened my view of Oracle's Fusion Middleware portfolio. I look back at the two years I spent in EPC and think: "Boy, I'd recommend that experience to absolutely anyone with the slightest interest in IT. There are so many different things you can do, and as many different roles you can end up taking, thanks to the experience gained at EPC."

    Read the article

  • NETWORK_ERROR: XMLHttpRequest Exception 101

    - by pawan Mangal
    I am getting the error NETWORK_ERROR: XMLHttpRequest Exception 101 when trying to get XML content from another site. Here is my code:

        var xmlhttp;
        if (window.XMLHttpRequest) {
            xmlhttp = new XMLHttpRequest();
        }
        if (xmlhttp == null) {
            alert("Your browser does not support XMLHTTP!");
            return;
        }
        xmlhttp.onreadystatechange = function () {
            if (xmlhttp.readyState == 4) {
                var value = xmlhttp.responseXML;
                alert(value);
            }
        };
        xmlhttp.open("GET", url, false);
        xmlhttp.send(null);

    Does anyone have a solution?

    Read the article

  • Any hosted versions of jQuery that have the 'Access-Control-Allow-Origin: *' header set?

    - by Greg Bray
    I have been working with jQuery recently and ran into a problem where I couldn't include it in a userscript, because XMLHttpRequest is subject to the same-origin policy. After further testing I found that most browsers also support the Cross-Origin Resource Sharing access control defined by the W3C as a workaround for same-origin policy issues. I tested this by hosting the jQuery script on a local web server that included the Access-Control-Allow-Origin: * HTTP header, which allowed the script to be downloaded using XMLHttpRequest so that it could be included in my userscript. I'd like to use a hosted version of jQuery when releasing the script, but so far, testing with tools like http://www.seoconsultants.com/tools/headers, I have not found any sites that allow cross-origin access to the jQuery script. Here is the list I have tested so far: http://www.asp.net/ajaxlibrary/CDN.ashx http://code.google.com/apis/ajaxlibs/documentation/index.html#jquery http://docs.jquery.com/Downloading_jQuery#CDN_Hosted_jQuery Are there any other hosted versions of jQuery that do allow cross-origin access?

    Read the article

  • Put DrawingGroup on a Canvas?

    - by stefan.at.wpf
    Hello, I have a DrawingGroup and I want to put it on a Canvas, but because DrawingGroup is not a UIElement, this is not possible. What's the best way to do this? And from which class could I derive, so that I could do something like canvas1.Children.Add(new myDrawingGroup())? (Meaning I want to add my DrawingGroup as one element on the canvas, instead of several single Drawings/Geometries. I also need hit testing and databinding support.) Thank you very much for any hint!
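
    One common workaround, sketched under the assumption that the group lives in a variable myDrawingGroup: wrap it in a DrawingImage and host that in an Image, which is a UIElement and can be added to the Canvas:

        // using System.Windows.Controls; using System.Windows.Media;
        var host = new Image { Source = new DrawingImage(myDrawingGroup) };
        canvas1.Children.Add(host);

    Note this hosts the whole group as one static image; for per-shape hit testing and binding, a FrameworkElement hosting DrawingVisuals is the usual alternative.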

    Read the article

  • Windows 64-bit registry v.s. 32-bit registry

    - by George2
    Hello everyone, I heard that on the Windows x64 architecture, in order to support running both x86 and x64 applications, there are two separate sets of Windows registry: one that x86 applications access and one that x64 applications access. For example, if a COM component registers its CLSID in the x86 set of the registry, then an x64 application will never be able to access that COM component by CLSID, because x86 and x64 use different registry sets. So, my question is whether my understanding of the above example is correct. I would also like some documents to learn more about the two sets of registry on the x64 architecture. (I did some searching but did not find any valuable information.) Thanks in advance, George
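
    A small Win32 C illustration of the two views: by default a 32-bit process is redirected to the Wow6432Node branch, but it can opt into the 64-bit view with the KEY_WOW64_64KEY flag:

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            HKEY key;
            /* Without KEY_WOW64_64KEY, a 32-bit process opening this key is
               silently redirected to the Wow6432Node (32-bit) view. */
            LONG rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\Classes\\CLSID",
                                    0, KEY_READ | KEY_WOW64_64KEY, &key);
            if (rc == ERROR_SUCCESS) {
                printf("Opened the 64-bit view of the CLSID branch.\n");
                RegCloseKey(key);
            }
            return 0;
        }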

    Read the article

  • JQuery UI Dialog - Ajax Update on success $(this).dialog('close');

    - by Dan
    Having issues referencing $(this) from within the nested ajax 'success' function... I know this is a scope issue, but I can't seem to find a clean way to close the dialog on a successful update. Thanks for any help.

        $("#dialog_support_option_form").dialog({
            width: 400,
            height: 180,
            bgiframe: true,
            autoOpen: false,
            modal: true,
            buttons: {
                'Save Support Option': function () {
                    $.ajax({
                        type: 'POST',
                        url: "support_options/create_support_option.php",
                        data: $(this).find('form').serialize(),
                        success: function (data) {
                            $("#list_support_options").html(data);
                            $(this).dialog('close');
                        }
                    });
                },
                'Cancel': function () {
                    $(this).dialog('close');
                }
            },
            close: function () {
                $(this).find('input').val('');
            }
        });
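
    A common fix is to capture the dialog in a local variable while this still refers to it, then use that variable in the success callback (jQuery's $.ajax context option is another route); a sketch of the Save button:

        'Save Support Option': function () {
            var dlg = $(this); // captured while 'this' is still the dialog
            $.ajax({
                type: 'POST',
                url: "support_options/create_support_option.php",
                data: dlg.find('form').serialize(),
                success: function (data) {
                    $("#list_support_options").html(data);
                    dlg.dialog('close'); // closure variable instead of 'this'
                }
            });
        }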

    Read the article

  • how to group by over anonymous type with vb.net linq to object

    - by smoothdeveloper
    I'm trying to write a LINQ to Objects query in VB.NET. Here is the C# version of what I'm trying to achieve (I'm running this in LINQPad):

        void Main()
        {
            var items = GetArray(
                new { a = "a", b = "a", c = 1 },
                new { a = "a", b = "a", c = 2 },
                new { a = "a", b = "b", c = 1 });

            (from i in items
             group i by new { i.a, i.b } into g
             let p = new { k = g, v = g.Sum(i => i.c) }
             where p.v > 1
             select p).Dump();
        }

        // because VB.NET doesn't support anonymous-type array initializers,
        // this helper eases the translation
        T[] GetArray<T>(params T[] values) { return values; }

    I'm having a hard time with the Group By syntax, which is not the same (VB requires 'identifier = expression' in some places), as well as with the Sum lambda ('Expression expected' errors). Thanks so much for your help!
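
    For reference, a sketch of the VB.NET translation: the keys keep their names a and b in scope after the Group By, and the aggregate moves into the Into clause (VB 2010 array literals replace the GetArray helper):

        Dim items = {New With {.a = "a", .b = "a", .c = 1},
                     New With {.a = "a", .b = "a", .c = 2},
                     New With {.a = "a", .b = "b", .c = 1}}

        Dim query = From i In items
                    Group By i.a, i.b Into g = Group, v = Sum(i.c)
                    Where v > 1
                    Select a, b, v

        query.Dump()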

    Read the article
