Search Results

Search found 11543 results on 462 pages for 'partition wise join'.


  • A temporary disagreement

    - by Tony Davis
    Last month, Phil Factor caused a furore amongst some MVPs with an article that attempted to offer simple advice to developers regarding the use of table variables, versus local and global temporary tables, in their code. Phil makes clear that table variables do come with some fairly major limitations (no distribution statistics, no parallel query plans for queries that modify table variables) but goes on to suggest that for reasonably small-scale strategic uses, and with a bit of due care and testing, table variables are a "good thing". Not everyone shares his opinion; in fact, I imagine he was rather aghast to learn that there were those who felt his article was akin to pulling the pin out of a grenade and tossing it into the database; table variables should be avoided in almost all cases, according to their advice, in favour of temp tables. In other words, a fairly major feature of SQL Server should be more-or-less 'off limits' to developers.

    The problem with temp tables is that, because they are scoped either to the procedure or to the connection, it is easy to allow them to hang around for too long, eating up precious memory and bulking up the shared tempdb database. Unless they are explicitly dropped, global temporary tables, and local temporary tables created within a connection rather than within a stored procedure, will persist until the connection is closed or, with connection pooling, until the connection is reused. It's also quite common with ASP.NET applications to have connection leaks, as Bill Vaughn explains in his chapter in the "SQL Server Deep Dives" book: the web page exits without closing the connection object, maybe due to an error condition, and the connection then hangs around in the heap for what might be hours before being picked up by the garbage collector. Table variables are much safer in this regard, since they are batch-scoped and so are cleaned up automatically once the batch is complete, which also means that they are intuitive for the developer to use because they conform to scoping rules that are closer to those in procedural code. On the surface, then, they are an ideal way to deal with issues related to tempdb memory hogging.

    So why did Phil qualify his recommendation to use table variables? This is another of those cases where, like scalar UDFs and multi-statement table-valued UDFs, developers can sometimes get into trouble with a relatively benign-looking feature, due to the way it's been implemented in SQL Server. Once again the biggest problem is how they are handled internally by the SQL Server query optimizer, which can make very poor choices for JOIN orders and so on, in the absence of statistics, especially when joining to tables with highly-skewed data. The resulting execution plans can be horrible, as will be the resulting performance. If the JOIN is to a large table, that will hurt. Ideally, Microsoft would simply fix this issue so that developers can't get burned in this way; table variables have been around since SQL Server 2000, so Microsoft has had a bit of time to get it right. As I commented in regard to UDFs, when developers discover issues like this with standard features, the database becomes an alien planet to them, where death lurks around each corner, and they continue to avoid these "killer" features years after the problems have eventually been resolved.

    In the meantime, what is the right approach? Is it to say "hammers can kill, don't ever use hammers", or is it to try to explain, as Phil's article and follow-up blog post have tried to do, what the feature was intended for, why care must be applied in its use, and so enable developers to make properly-informed decisions, without requiring them to delve deep into the inner workings of SQL Server? Cheers, Tony.
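
    To make the trade-off concrete, here is a minimal T-SQL sketch of the two approaches under discussion; the dbo.Customers table it joins to is invented for the example:

        -- Table variable: batch-scoped and cleaned up automatically,
        -- but carries no distribution statistics for the optimizer.
        DECLARE @Orders TABLE (OrderID INT PRIMARY KEY, CustomerID INT);
        INSERT INTO @Orders (OrderID, CustomerID) VALUES (1, 42);

        SELECT o.OrderID, c.Name
        FROM @Orders AS o
        INNER JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID;

        -- Local temporary table: statistics are maintained, so joins can be
        -- costed properly, but it persists until explicitly dropped or until
        -- the connection is closed (or, with pooling, reused).
        CREATE TABLE #Orders (OrderID INT PRIMARY KEY, CustomerID INT);
        INSERT INTO #Orders (OrderID, CustomerID) VALUES (1, 42);

        SELECT o.OrderID, c.Name
        FROM #Orders AS o
        INNER JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID;

        DROP TABLE #Orders; -- explicit cleanup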

    Read the article

  • Willy Rotstein on Supply Chain Planning

    - by sarah.taylor(at)oracle.com
    Each time a merchandiser, buyer or planner in Retail makes a business decision around assortment, inventory, pricing and promotions, there is an opportunity to improve both Profitability and Customer Service. Improving decision making, however, has always been a tricky business for retailers. I have worked in this space for more than 15 years. I began my career as an academic, at Imperial College London, and then broadened this interest with Retailers, aiming to optimize their merchandising and supply chain decisions.

    Planning the business and optimizing profit is a complex process. The complexity arises from the variety of people involved, the large number of decisions to take across all business processes, the uncertainty intrinsic to the retail environment, as well as the volume of data available for analysis. Things are not getting any easier either. The advent of multi-channel, social media and mobile is taking these complexities to a new level and presenting additional opportunities for those willing to exploit them. I guess it is due to the complexities of the decision making process that, over the last couple of years working with Oracle Retail, I have witnessed a clear trend around the deployment of planning systems. Retailers are aiming to simplify their decision making processes. They want to use one joined-up planning platform across the business and enhance it with "actionable" data mining and optimization techniques.

    At Oracle Retail, we have a vibrant community of international retailers who regularly come together to discuss the big issues in retail planning. It is a combination of fashion, grocery and speciality retailers, all sharing their best practice vision for planning and optimizing merchandise decisions. As part of the Retail Exchange program, at the recent National Retail Federation event in New York, I jointly hosted a Planning dinner with Peter Fitzgerald from Google UK's Retail Division. Those retailers from our international planning community who were in New York for the annual NRF event were able to attend. The group comprised some of Europe's great international retail brands. All sectors were represented by organisations like Mango, LVMH, Ahold, Morrisons, Shop Direct and River Island. They confirmed the current importance of engaging with Planning and Optimization issues. In particular, the impact of the internet was a key topic. We had a great debate about new retail initiatives. Peter highlighted how mobility is changing retail, in particular with the new "local availability search" initiative. We also had an exciting discussion around the opportunities to improve merchandising using the new data that is becoming available from search, social media and ecommerce sites. It will be our focus to continue to help retailers translate this data into better results while keeping their business operations simple. New developments in "actionable" analytics and computing capacity make this a very exciting area today. Watch this space for my contributions on these topics, which will be made available through this blog.

    Oracle Retail has a strong Planning community. If you are a category manager, a planner, a buyer, a merchandiser, a retail supplier or any retail executive with a keen interest in planning, then you would be very welcome to join Oracle Retail's Planning Community. As part of our community you will be able to join our in-person and virtual events, and download topical white papers and best practice information specifically tailored to your area of interest. If anyone would like to register their interest in joining our community of retailers discussing planning, then please contact me at [email protected]. Willy Rotstein, Oracle Retail

    Read the article

  • The Java Community Process: What's Broken and How to Fix It

    - by Tori Wieldt
    In a panel discussion today at TheServerSide Java Symposium, Patrick Curran, Head of the Java Community Process, James Gosling, and Reza Rahman, member of the Java EE 6 and EJB 3.1 expert groups, discussed the state of the JCP. Moderated by Cameron McKenzie, Editor of TheServerSide.com, they discussed what's wrong with the JCP and ways to fix it.

    What's wrong with the JCP? Reza Rahman was quite supportive of the JCP. "I work as a consultant, and it's much better than getting a decision made at a large company," Reza commented. He gave the JCP "five stars" and explained that as an individual, he was able to have an impact on things that mattered to him. Cameron asked, "Now all these JCP problems came after Oracle acquired Sun, right?" The crowd had a good laugh, and the panel all agreed many of the JCP problems existed under Sun. How is the JCP handled differently under Oracle than Sun? "Pretty similar," said James. Oracle "tends more towards practicality," said Reza. "I'm glad to see things moving again; we've got several new JSRs filed," Patrick commented.

    How to fix it? They all agreed greater transparency is a top issue. Without it, people assume sinister behavior whether it's there or not. Patrick said that currently spec leads are "encouraged" to be transparent, and the JCP office is planning to submit JSRs to change the JCP process so transparency is mandated, both for mailing lists and issue tracking. Shining a light on problems is the best way to fix them. Reza said the biggest problem is lack of participation from the community. If more people are involved, a lot of the problems go away. "Developers are too nonchalant; they should realize what happens in the JCP has a direct impact on their career and they need to get involved," Reza commented.

    Get involved! During Q&A, someone asked how a developer could get involved. They answered: pick a JSR you are interested in and follow it. To start, you could read an article about the JSR and comment on the article (expert group members do read the comments). Or read the spec, discuss it with others and post a blog about it. Read the expert group proceedings. Join the JCP (free for individuals). Open source projects have code that you can download and play with; download it and provide feedback. Patrick mentioned that the JCP really wants more participation. "One way we are working on it is that we are encouraging JUGs to join the JCP as a group, and that makes all members of the JUG JCP members," Patrick said. They commented that most spec leads are desperate for feedback. "And, please get involved BEFORE the spec is finalized!" James declared. Someone from the audience said it's hard to put valuable time into something before it's baked. Patrick explained that Proposed Final Draft (PFD) is the point in the JCP process when the spec is mature enough to review but not yet finalized. The panel agreed the worst thing that could happen is that most people in the Java community just complain about the JCP without getting involved. Developer Sumit Goyal, conference attendee, thought it was a healthy discussion. "I got insights into how JSRs are worked on and finalized," he said.

    Key links:
    The Java Community Process website: http://jcp.org/en/home/index
    Article: A Conversation with JCP Chair Patrick Curran
    Oracle Technology Network: http://www.oracle.com/technetwork/java/index.html
    TheServerSide Java Symposium: http://javasymposium.techtarget.com/

    Read the article

  • Removing old kernel entries in Grub

    - by To Do
    I regularly delete old kernels, leaving only the latest two entries, using Synaptic. I'm using Precise. However, in my Grub "previous Linux version" menu there are quite a few entries labelled 2.6.8, and I cannot find these linux-images in Synaptic.

        dpkg -l | grep linux-image

    gives:

        rc  linux-image-3.0.0-17-generic      3.0.0-17.30                Linux kernel image for version 3.0.0 on x86/x86_64
        ii  linux-image-3.2.0-27-generic      3.2.0-27.43                Linux kernel image for version 3.2.0 on 32 bit x86 SMP
        ii  linux-image-3.2.0-29-generic      3.2.0-29.46                Linux kernel image for version 3.2.0 on 32 bit x86 SMP
        ii  linux-image-3.4.0-030400-generic  3.4.0-030400.201205210521  Linux kernel image for version 3.4.0 on 32 bit x86 SMP
        ii  linux-image-generic               3.2.0.29.31                Generic Linux kernel image

        sudo update-grub

    gives:

        Generating grub.cfg ...
        Found linux image: /boot/vmlinuz-3.4.0-030400-generic
        Found initrd image: /boot/initrd.img-3.4.0-030400-generic
        Found linux image: /boot/vmlinuz-3.2.0-29-generic
        Found initrd image: /boot/initrd.img-3.2.0-29-generic
        Found linux image: /boot/vmlinuz-3.2.0-27-generic
        Found initrd image: /boot/initrd.img-3.2.0-27-generic
        Found linux image: /boot/vmlinuz-2.6.38-11-generic
        Found initrd image: /boot/initrd.img-2.6.38-11-generic
        Found linux image: /boot/vmlinuz-2.6.38-10-generic
        Found initrd image: /boot/initrd.img-2.6.38-10-generic
        Found linux image: /boot/vmlinuz-2.6.38-8-generic
        Found initrd image: /boot/initrd.img-2.6.38-8-generic
        Found memtest86+ image: /boot/memtest86+.bin
        Found Windows Vista (loader) on /dev/sda1

        sudo apt-get remove linux-image-2.6.8-8-generic

    gives:

        E: Unable to locate package linux-image-2.6.8-8-generic
        E: Couldn't find any package by regex 'linux-image-2.6.8-8-generic'

    My boot folder contains the following:

        abi-2.6.38-10-generic         initrd.img-3.4.0-030400-generic
        abi-2.6.38-11-generic         memtest86+.bin
        abi-2.6.38-8-generic          memtest86+_multiboot.bin
        abi-3.2.0-27-generic          System.map-2.6.38-10-generic
        abi-3.2.0-29-generic          System.map-2.6.38-11-generic
        abi-3.4.0-030400-generic      System.map-2.6.38-8-generic
        config-2.6.38-10-generic      System.map-3.2.0-27-generic
        config-2.6.38-11-generic      System.map-3.2.0-29-generic
        config-2.6.38-8-generic       System.map-3.4.0-030400-generic
        config-3.2.0-27-generic       vmcoreinfo-2.6.38-10-generic
        config-3.2.0-29-generic       vmcoreinfo-2.6.38-11-generic
        config-3.4.0-030400-generic   vmcoreinfo-2.6.38-8-generic
        extlinux                      vmlinuz-2.6.38-10-generic
        grub                          vmlinuz-2.6.38-11-generic
        initrd.img-2.6.38-10-generic  vmlinuz-2.6.38-8-generic
        initrd.img-2.6.38-11-generic  vmlinuz-3.2.0-27-generic
        initrd.img-2.6.38-8-generic   vmlinuz-3.2.0-29-generic
        initrd.img-3.2.0-27-generic   vmlinuz-3.4.0-030400-generic
        initrd.img-3.2.0-29-generic

    and ls -l /etc/grub.d yields:

        total 56
        -rwxr-xr-x 1 root root 6715 Apr 17 20:16 00_header
        -rwxr-xr-x 1 root root 5522 Oct  1  2011 05_debian_theme
        -rwxr-xr-x 1 root root 7407 May 17 09:22 10_linux
        -rwxr-xr-x 1 root root 6335 Apr 17 20:16 20_linux_xen
        -rwxr-xr-x 1 root root 1588 May  3  2011 20_memtest86+
        -rwxr-xr-x 1 root root 7603 Apr 17 20:16 30_os-prober
        -rwxr-xr-x 1 root root  214 Oct  1  2011 40_custom
        -rwxr-xr-x 1 root root   95 Oct  1  2011 41_custom
        -rw-r--r-- 1 root root  483 Oct  1  2011 README

    gdisk -l /dev/sda yields:

        Partition table scan:
          MBR: MBR only
          BSD: not present
          APM: not present
          GPT: not present

        ***************************************************************
        Found invalid GPT and valid MBR; converting MBR to GPT format.
        ***************************************************************

        Disk /dev/sda: 312581808 sectors, 149.1 GiB
        Logical sector size: 512 bytes
        Disk identifier (GUID): F832A498-05E1-4615-B5B1-757ACB4A757A
        Partition table holds up to 128 entries
        First usable sector is 34, last usable sector is 312581774
        Partitions will be aligned on 2048-sector boundaries
        Total free space is 4183661 sectors (2.0 GiB)

        Number  Start (sector)  End (sector)  Size      Code  Name
        1       2048            61442047      29.3 GiB  0700  Microsoft basic data
        3       163842048       169986047     2.9 GiB   8200  Linux swap
        4       169986048       312578047     68.0 GiB  0700  Microsoft basic data
        5       61444096        159666175     46.8 GiB  8300  Linux filesystem

    Please help me remove these old, nonexistent kernels from Grub.

    Read the article

  • Copy TFS Build Definitions between Projects and Collections

    - by Jakob Ehn
    Originally posted on: http://geekswithblogs.net/jakob/archive/2014/06/05/copy-tfs-build-definitions-between-projects-and-collections.aspx

    Over the last couple of years it has become apparent that using multiple team projects in TFS is generally a bad idea. There are of course exceptions to this, but there are a lot of things that become much easier when you put all of your projects and teams in the same team project. Fellow ALM MVP Martin Hinshelwood has blogged about this several times, as have other people in the community. In particular, using the backlog and portfolio management tools makes much more sense when everything is located in the same team project.

    Consolidating multiple team projects into one is unfortunately not that easy; it involves migrating source code, work items, reports etc. Another thing that also needs to be migrated is build definitions. It is possible to clone build definitions within the same team project using the TFS Power Tools. The Community TFS Build Manager also lets you clone build definitions to other team projects. But there is no tool that allows you to clone/copy a build definition to another collection. So, I whipped up a simple console application that lets you do this. The tool can be downloaded from https://onedrive.live.com/redir?resid=EE034C9F620CD58D!8162&authkey=!ACTr56v1QVowzuE&ithint=file%2c.zip

    Using CopyTFSBuildDefinitions
    You use the tool like this:

        CopyTFSBuildDefinitions SourceCollectionUrl SourceTeamProject BuildDefinitionName DestinationCollectionUrl DestinationTeamProject [NewDefinitionName]

    Arguments:
    SourceCollectionUrl - the URL of the TFS collection that contains the team project with the build definition that you want to copy
    SourceTeamProject - the name of the team project that contains the build definition
    BuildDefinitionName - the name of the build definition
    DestinationCollectionUrl - the URL of the TFS collection that contains the team project that you want to copy your build definition to
    DestinationTeamProject - the name of the team project in the destination collection
    NewDefinitionName - (optional) use this to override the name of the new build definition; if you don't specify this, the name will be the same as the original one

    Example:

        CopyTFSBuildDefinitions https://jakob.visualstudio.com DemoProject WebApplication.CI https://anotheraccount.visualstudio.com

    Notes
    Since we are (potentially) creating a build definition in a new collection, there is no guarantee that the various paths defined in the build definition exist in the new collection. For example, a build definition refers to server paths in TFVC or repos + branches in TFGit. It also refers to build controllers that definitely don't exist in the new collection. So there will be some cleanup to do after you copy your build definitions. You can fix some of these using the Community TFS Build Manager; for example, it is very easy to apply the correct build controller to a set of build definitions.

    The problem stated above also applies to build process templates. However, the tool tries to find a build process template in the new team project with the same file name as the one that existed in the old team project. If it finds one, it will be used for the new build definition. Otherwise it will use the default build template.

    If you want to run the tool for many build definitions, you can use this SQL script, compliments of Scrum/ALM MVP Richard Hundhausen, to generate the necessary commands:

        USE Tfs_Collection
        GO
        SELECT 'CopyTFSBuildDefinitions.exe http://SERVER:8080/tfs/collection "' + P.ProjectName + '" "'
               + REPLACE(BD.DefinitionName, '\', '') + '" http://NEWSERVER:8080/tfs/COLLECTION TEAMPROJECT'
        FROM   tbl_Project P
               INNER JOIN tbl_BuildGroup BG ON BG.TeamProject = P.ProjectUri
               INNER JOIN tbl_BuildDefinition BD ON BD.GroupId = BG.GroupId
        ORDER BY P.ProjectName, BD.DefinitionName

    Hope that helps; let me know if you have any problems with the tool or if you find it useful.

    Read the article

  • How do I deal with a third party application that has embedded hints that result in a sub-optimal execution plan in my environment?

    - by Maria Colgan
    I have gotten many variations on this question recently as folks begin to upgrade to Oracle Database 11g, and there have been several posts on this blog and on others describing how to use SQL Plan Management (SPM) so that a non-hinted SQL statement can use a plan generated with hints. But what if the hint is supplied in the third-party application and is causing performance regressions on your system? You can actually use a very similar technique to the ones shown before, but this time capture the un-hinted plan and have the hinted SQL statement use that plan instead. Below is an example that demonstrates the necessary steps.

    1. We begin by running the hinted statement.
    2. After examining the execution plan we can see it is suboptimal because of a bad join order.
    3. In order to use SPM to correct the problem we must create a SQL plan baseline for the statement. In order to create a baseline we will need the SQL_ID for the hinted statement; the easiest place to get it is V$SQL.
    4. A SQL plan baseline can be created using a SQL_ID and DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE. This will capture the existing plan for this SQL_ID from the shared pool and store it in the SQL plan baseline.
    5. We can check that the SQL plan baseline was created successfully by querying DBA_SQL_PLAN_BASELINES.
    6. When you manually create a SQL plan baseline, the first plan added is automatically accepted and enabled. We know that the hinted plan is a poorly performing plan, so we will disable it using DBMS_SPM.ALTER_SQL_PLAN_BASELINE. Disabling the plan tells the optimizer that this plan is not a good plan; however, since there is no alternative plan at this point, the optimizer will still continue to use this plan until we provide a better one.
    7. Now let's run the statement without the hint.
    8. Looking at the execution plan we can see that the join order is different. The plan without the hint also has a lower cost (3X lower), which indicates it should perform better.
    9. In order to map the un-hinted plan to the hinted SQL statement we need to add the plan to the SQL plan baseline for the hinted statement. We can do this using DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE, but we will need the SQL_ID and PLAN_HASH_VALUE of the non-hinted statement, which we can find in V$SQL.
    10. Now we can add the non-hinted plan to the SQL plan baseline of the hinted SQL statement using DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE. This time we need to pass a few more arguments: we will use the SQL_ID and PLAN_HASH_VALUE of the non-hinted statement but the SQL_HANDLE of the hinted statement.
    11. The SQL plan baseline for our statement now has two plans, but only the newly added plan (SQL_PLAN_gbpcg3f67pc788a6d8911) is enabled and accepted. This tells the optimizer that this is the plan it should use for this statement. We can confirm that the correct plan (the non-hinted one) will be selected for the statement from now on by re-executing the hinted statement and checking its execution plan.
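
    For reference, here is a rough sketch of steps 4, 6 and 10 in SQL; the &-prefixed substitution variables are placeholders for the SQL_IDs, plan name, plan hash value and SQL handle you would look up in V$SQL and DBA_SQL_PLAN_BASELINES:

        DECLARE
          n PLS_INTEGER;
        BEGIN
          -- Step 4: capture the hinted statement's current plan into a baseline.
          n := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(sql_id => '&hinted_sql_id');

          -- Step 6: disable the hinted (poorly performing) plan.
          n := DBMS_SPM.ALTER_SQL_PLAN_BASELINE(
                 sql_handle      => '&sql_handle',
                 plan_name       => '&hinted_plan_name',
                 attribute_name  => 'enabled',
                 attribute_value => 'NO');

          -- Step 10: attach the un-hinted plan to the hinted statement's baseline.
          n := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(
                 sql_id          => '&unhinted_sql_id',
                 plan_hash_value => &unhinted_plan_hash_value,
                 sql_handle      => '&sql_handle');
        END;
        /

        -- Steps 5 and 11: inspect the baseline's contents.
        SELECT sql_handle, plan_name, enabled, accepted
        FROM   dba_sql_plan_baselines;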

    Read the article

  • A quick look at: sys.dm_os_buffer_descriptors

    - by Jonathan Allen
    SQL Server places data into cache as it reads it from disk, so as to speed up future queries. This DMV lets you see how much data is cached at any given time, and knowing how this changes over time can help you ensure your servers run smoothly and are adequately resourced to run your systems. This DMV gives the number of cached pages in the buffer pool along with the database id that they relate to:

        USE [tempdb]
        GO
        SELECT COUNT(*) AS cached_pages_count ,
               CASE database_id
                 WHEN 32767 THEN 'ResourceDb'
                 ELSE DB_NAME(database_id)
               END AS Database_name
        FROM   sys.dm_os_buffer_descriptors
        GROUP BY DB_NAME(database_id) , database_id
        ORDER BY cached_pages_count DESC;

    This gives you results which are quite useful, but if you add a new column, ( COUNT(*) * 8.0 ) / 1024 AS MB, to convert the pages value to an MB value, then they become more relevant and meaningful.

    To see how your server reacts to queries, start up SSMS and connect to a test server and database – mine is called AdventureWorks2008. Make sure you start from a known position by running:

        -- Only run this on a test server, otherwise your production server's
        -- performance may drop off a cliff and your phone will start ringing.
        DBCC DROPCLEANBUFFERS
        GO
        DBCC FREEPROCCACHE
        GO

    Now we can run a query that would normally turn a DBA's hair white:

        USE [AdventureWorks2008]
        GO
        SELECT *
        FROM   [Sales].[SalesOrderDetail] AS sod
               INNER JOIN [Sales].[SalesOrderHeader] AS soh
                   ON [sod].[SalesOrderID] = [soh].[SalesOrderID]

    …and then check our cache situation. A nice low figure – not! Almost 2000 pages of data in cache, equating to approximately 15MB. Luckily these tables are quite narrow; if this had been on a table with more columns then this could be even more dramatic. So, let's make our query more efficient. After resetting the cache with the DROPCLEANBUFFERS and FREEPROCCACHE code above, we'll only select the columns we want and implement a WHERE predicate to limit the rows to a specific customer.

        SELECT [sod].[OrderQty] ,
               [sod].[ProductID] ,
               [soh].[OrderDate] ,
               [soh].[CustomerID]
        FROM   [Sales].[SalesOrderDetail] AS sod
               INNER JOIN [Sales].[SalesOrderHeader] AS soh
                   ON [sod].[SalesOrderID] = [soh].[SalesOrderID]
        WHERE  [soh].[CustomerID] = 29722

    …and check our effect on the cache. Now that is more sympathetic to our server and the other systems sharing its resources.

    I can hear you asking: "What has this got to do with logging, Jonathan?" Well, a smart DBA will keep an eye on this metric on their servers so they know how their hardware is coping, and will be ready to investigate anomalies so that no 'disruptive' code starts to unsettle things. Capturing this information over a period of time can lead you to build a picture of how a database relies on the cache and how it interacts with other databases. This might allow you to decide on appropriate schedules for overnight jobs or otherwise balance the work of your server.

    You could schedule this to run with a SQL Agent job and store the data in your DBA's database by creating a table with:

        IF OBJECT_ID('CachedPages') IS NOT NULL
            DROP TABLE CachedPages

        CREATE TABLE CachedPages
            (
              cached_pages_count INT ,
              MB INT ,
              Database_Name VARCHAR(256) ,
              CollectedOn DATETIME DEFAULT GETDATE()
            )

    …and then filling it with:

        INSERT INTO [dbo].[CachedPages]
                ( [cached_pages_count] ,
                  [MB] ,
                  [Database_Name] )
        SELECT COUNT(*) AS cached_pages_count ,
               ( COUNT(*) * 8.0 ) / 1024 AS MB ,
               CASE database_id
                 WHEN 32767 THEN 'ResourceDb'
                 ELSE DB_NAME(database_id)
               END AS Database_name
        FROM   sys.dm_os_buffer_descriptors
        GROUP BY database_id

    After this has been left logging your system metrics for a while, you can easily see how your databases use the cache over time and may see some spikes that warrant your attention. This sort of logging can be applied to all sorts of server statistics so that you can gather information that will give you baseline data on how your servers are performing. This means that when you get a problem you can see which statistics are out of their normal range and target your efforts to resolve the issue more rapidly.
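
    As a sketch of reviewing the collected data, a query like this against the CachedPages table created above shows each database's cache usage over time:

        -- Review logged cache usage per database, most recent first.
        SELECT Database_Name ,
               CollectedOn ,
               cached_pages_count ,
               MB
        FROM   dbo.CachedPages
        ORDER BY Database_Name , CollectedOn DESC;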

    Read the article

  • Bridging the Gap in Cloud, Big Data, and Real-time

    - by Dain C. Hansen
    With all the buzz around big data and cloud computing, it is easy to overlook one of your most precious commodities: your data. Today's businesses cannot stand still when it comes to data. Market success now depends on speed, volume, complexity, and keeping pace with the latest data integration breakthroughs. Are you up to speed with big data, cloud integration, and real-time analytics? Join us in this three-part blog series where we'll look at each component in more detail. Meet us online on October 24th, where we'll take your questions about the issues you are facing in this brave new world of integration.

    Let's start first with cloud. What happens to your data when you decide to implement a private cloud architecture? Or a public cloud? Data integration solutions play a vital role migrating data simply, efficiently, and reliably to the cloud; they are a necessary ingredient of any platform-as-a-service strategy because they support cloud deployments with data-layer application integration between on-premise and cloud environments of all kinds.

    For private cloud architectures, consolidation of your databases and data stores is an important step to take to receive the full benefits of cloud computing. Private cloud integration requires bidirectional replication between heterogeneous systems to allow you to perform data consolidation without interrupting your business operations. In addition, bulk loading and transforming data into and out of your private cloud is a crucial step for those companies moving to private cloud. There is also a need to manage data services as part of SOA/BPM solutions that enable agile application delivery and help build shared data services for organizations.

    But what about public cloud? If you have moved your data to a public cloud application, you may also need to connect your on-premise enterprise systems and the cloud environment by moving data in bulk or as real-time transactions across geographies. For both public and private cloud architectures, Oracle offers a complete and extensible set of integration options that span not only data integration but also service and process integration, security, and management. For those companies investing in Oracle Cloud, you can move your data through Oracle SOA Suite using REST APIs to Oracle Messaging Cloud Service, a new service that lets applications deployed in Oracle Cloud securely and reliably communicate over Java Message Service. As an example of loading and transforming data into other public clouds, Oracle Data Integrator supports a knowledge module for Salesforce.com, now available on AppExchange. Other third-party knowledge modules are being developed by customers and partners every day.

    To learn more about how to leverage Oracle's Data Integration products for cloud, join us live: Data Integration Breakthroughs Webcast on October 24th, 10 AM PST.

    Read the article

  • Silverlight Cream for February 02, 2011 -- #1039

    - by Dave Campbell
    In this Issue: Tony Champion, Gill Cleeren, Alex van Beek, Michael James, Ollie Riches, Peter Kuhn, Mike Ormond, WindowsPhoneGeek(-2-), Daniel N. Egan, Loek Van Den Ouweland, and Paul Thurrott.

    Above the Fold:
    Silverlight: "Using the AutoCompleteBox" Peter Kuhn
    WP7: "Windows Phone Image Button" Loek Van Den Ouweland
    Training: "New WP7 Virtual Labs" Daniel N. Egan

    Shoutouts:
    SilverlightShow has their top 5 most popular news articles up: SilverlightShow for Jan 24-30, 2011. Rudi Grobler posted answers he gives to questions about Silverlight - Where do I start? Brian Noyes starts a series of Webinars at SilverlightShow this morning at 10am PDT: Free Silverlight Show Webinar: Querying and Updating Data From Silverlight Clients with WCF RIA Services. Join your fellow geeks at Gangplank in Chandler, Arizona this Saturday as Scott Cate and AZGroups bring you Azure Boot Camp – Feb 5th 2011.

    From SilverlightCream.com:

    Deploying Silverlight with WCF Services: Tony Champion takes a step out of his norm (Pivot) and has a post up about deploying WCF Services with your SL app, and how to take the pain out of that without pulling out your hair.

    Getting ready for Microsoft Silverlight Exam 70-506 (Part 3): Gill Cleeren's part 3 of getting ready for the Silverlight Exam is up at SilverlightShow... with links to the first two parts. There's so much good information linked off these... thanks Gill and 'The Show'!

    A guide through WCF RIA Services attributes: Alex van Beek has a post up you will probably want to bookmark unless you're not using WCF RIA... do you know all the attributes by heart? ... how about an excellent explanation of 10 of them?

    Using DeferredLoadListBox in a Pivot Control: Michael James discusses using the DeferredLoadListBox, and then also using it with the Pivot control... but not without some pain points, which he defines and gives the workaround for.

    WP7: Know your data: Ollie Riches' latest is about data and WP7 ... specifically 'knowing' what data you're needing/using to avoid the 90MB memory limit... He gives a set of steps to follow to measure your data model to avoid getting in trouble.

    Using the AutoCompleteBox: Peter Kuhn takes a great look at the AutoCompleteBox... the basics, and then well beyond with custom data, item templates, custom filters, asynchronous filtering, and a behavior for MVVM async filtering.

    OData and Windows Phone 7 Part 2: Mike Ormond has part 2 of his OData/WP7 post up... lashing up the images to go along with the code this time out... nice looking app.

    WP7 RoundToggleButton and RoundButton in depth: WindowsPhoneGeek is checking out the RoundToggleButton and RoundButton controls from the Coding4fun Toolkit in detail... of course where to get them, and then the setup, demo project included.

    All about Dependency Properties in Silverlight for WP7: WindowsPhoneGeek's latest post is a good dependency-property discussion related to WP7 development, but if you're just learning, it's a good place to learn about the subject.

    New WP7 Virtual Labs: Daniel N. Egan posted links to 6 new WP7 Virtual Labs released on 1/25.

    Windows Phone Image Button: Loek Van Den Ouweland has a style up on his blog that gives you an imageButton for your WP7 apps, and a sweet little video showing how it's done in Expression Blend too.

    Yet another free Windows Phone book for developers: Paul Thurrott found a link to another free eBook for WP7 development. This one is by Puja Pramudya and is an English translation of the original, and is an introductory text, but hey... it's free... give it a look!

    Read the article

  • Silverlight Cream for December 12, 2010 - 2 -- #1009

    - by Dave Campbell
    In this Issue: Michael Crump, Jesse Liberty, Shawn Wildermuth, Domagoj Pavlešic, Peter Kuhn, James Ashley, Sara Summers, Morten Nielsen, Peter Torr, and Tau Sick.

    Above the Fold:
    Silverlight: "Silverlight 4 – Coded UI Framework Video Tutorial" Michael Crump
    WP7: "Windows Phone From Scratch #12–Custom Behaviors (Part I)" Jesse Liberty

    From SilverlightCream.com:

    Silverlight 4 – Coded UI Framework Video Tutorial: Michael Crump posted a video tutorial today on the Coded UI Test Framework that we got with the VS2010 Feature Pack 2. Wanna create automated tests? ... check out Michael's video and save yourself some time.

    Windows Phone From Scratch #12–Custom Behaviors (Part I): Jesse Liberty posted his Windows Phone from Scratch number 12 today... and it's on Custom Behaviors... cool stuff... you need to read this and get your head around it... this is part 1, jump on it before he drops part 2 on us!

    The Next Application Platform? All of them...: Shawn Wildermuth has a thought-provoking post up ... check it out and see if you're ready to join him on the adventure of building for all the platforms...

    Windows Phone 7 Accelerometer Test App: Domagoj Pavlešic has a test app up for the accelerometer on the WP7 ... if you need to use it, and are having problems, a good example always helps me.

    Protocol of developing an animation texture tool: Peter Kuhn found a need for a tool to create some animations for a WP7 XNA game... so he challenged himself to write it, and detailed all his steps as he went.

    Re-examining WP7 Launchers and Choosers: James Ashley's most recent post is on the Pivot Control ... check this out... add a working horizontally oriented slider to a pivot... plus some external links to help out.

    New Prototyping Sketch Sheets for WP7: This is one of those posts that I had to go to SilverlightCream and make sure I hadn't hit it yet... pretty cool prototype sheets for WP7 by Sara Summers ... we've seen others, they're all good.

    Simulating GPS on Windows Phone 7: Morten Nielsen helps you get around the fact that you're not going to be able to use the emulator for testing your GPS app ... at least not without some assistance... and that doesn't mean hauling your dev system around your neighborhood, either.

    How to correctly handle application deactivation and reactivation: We've seen posts on Tombstoning, but probably not from Silverlight team members... check this one out from Peter Torr ... great event sequence information and all the info on how to correctly handle it, plus external links to the documentation... you knew there was documentation, right? :)

    Localizing a Windows Phone 7 Application: Tau Sick has a post up discussing localization and your WP7 apps... coming from someone with an app in the marketplace in 3 languages, it's a pretty good bet he's got it figured out!

    Read the article

  • Big Data – Data Mining with Hive – What is Hive? – What is HiveQL (HQL)? – Day 15 of 21

    - by Pinal Dave
    In yesterday’s blog post we learned the importance of the operational database in the Big Data story. In this article we will understand what Hive and HQL are in the Big Data story. Yahoo started working on Pig (we will cover that in the next blog post) for their application deployment on Hadoop; Yahoo's goal was to manage their unstructured data. Similarly, Facebook started deploying their warehouse solutions on Hadoop, which resulted in Hive. The reason for going with Hive is that traditional warehousing solutions are getting very expensive.

    What is Hive? Hive is a data warehousing infrastructure for Hadoop. Its primary responsibility is to provide data summarization, query and analysis. It supports analysis of large datasets stored in Hadoop's HDFS as well as on the Amazon S3 filesystem. The best part of Hive is that it supports SQL-like access to structured data, known as HiveQL (or HQL), as well as big data analysis with the help of MapReduce. Hive is not built to get a quick response to queries; rather, it is built for data mining applications, which can take from several minutes to several hours to analyze the data, and that is where Hive is primarily used.

    Hive organization: the data are organized in three different formats in Hive.

    Tables: they are very similar to RDBMS tables and contain rows and columns. Hive is just layered over the Hadoop File System (HDFS), hence tables are directly mapped to directories of the filesystem. It also supports tables stored in other native file systems.

    Partitions: Hive tables can have more than one partition. Partitions are likewise mapped to subdirectories in the file system.

    Buckets: in Hive, data may be divided into buckets. Buckets are stored as files within the partitions in the underlying file system.

    Hive also has a metastore, which stores all the metadata. It is a relational database containing various information related to Hive schemas (column types, owners, key-value data, statistics etc.). We can use a MySQL database here.

    What is HiveQL (HQL)? The Hive query language provides the basic SQL-like operations. Here are a few of the tasks which HQL can do easily:

    - Create and manage tables and partitions
    - Support various relational, arithmetic and logical operators
    - Evaluate functions
    - Download the contents of a table to a local directory, or the results of queries to an HDFS directory

    Here is an example of HQL queries:

        SELECT upper(name), salesprice FROM sales;
        SELECT category, count(1) FROM products GROUP BY category;

    When you look at the above queries, you can see they are very similar to SQL queries.

    Tomorrow: in tomorrow's blog post we will discuss a very important component of the Big Data ecosystem – Pig.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
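
    To make partitions and buckets concrete, here is a small illustrative HiveQL table definition, reusing the sales table from the queries above; the customer_id and sale_date columns are invented for the example:

        -- Sales data partitioned by date: each sale_date value becomes an
        -- HDFS subdirectory. Within each partition, rows are hashed on
        -- customer_id into 32 bucket files.
        CREATE TABLE sales (
          customer_id BIGINT,
          name        STRING,
          salesprice  DOUBLE
        )
        PARTITIONED BY (sale_date STRING)
        CLUSTERED BY (customer_id) INTO 32 BUCKETS;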

    Read the article

  • Finding the maximum value/date across columns

    - by AtulThakor
    While working on some code recently I discovered a neat little trick to find the maximum value across several columns. The starting point was finding the maximum date across several related tables and storing the maximum value against an aggregated record. Here's the sample setup code:

        USE TEMPDB

        IF OBJECT_ID('CUSTOMER') IS NOT NULL
        BEGIN
            DROP TABLE CUSTOMER
        END

        IF OBJECT_ID('ADDRESS') IS NOT NULL
        BEGIN
            DROP TABLE ADDRESS
        END

        IF OBJECT_ID('ORDERS') IS NOT NULL
        BEGIN
            DROP TABLE ORDERS
        END

        SELECT 1 AS CUSTOMERID ,
               'FREDDY KRUEGER' AS NAME ,
               GETDATE() - 10 AS DATEUPDATED
        INTO   CUSTOMER

        SELECT 100000 AS ADDRESSID ,
               1 AS CUSTOMERID ,
               '1428 ELM STREET' AS ADDRESS ,
               GETDATE() - 5 AS DATEUPDATED
        INTO   ADDRESS

        SELECT 123456 AS ORDERID ,
               1 AS CUSTOMERID ,
               GETDATE() + 1 AS DATEUPDATED
        INTO   ORDERS

    The original code used a function to determine the maximum date, and this performed poorly. After considering pivoting the data I opted for a CASE statement; this seemed reasonable until I discovered other areas which needed to determine the maximum date between 5 or more tables, which didn't scale well. The final solution involved using the VALUES clause within a subquery, as follows:

        SELECT C.CUSTOMERID ,
               A.ADDRESSID ,
               ( SELECT MAX(DT)
                 FROM   ( VALUES ( C.DATEUPDATED ),
                                 ( A.DATEUPDATED ),
                                 ( O.DATEUPDATED ) ) AS VALUE (DT)
               )
        FROM   CUSTOMER C
               INNER JOIN ADDRESS A ON C.CUSTOMERID = A.CUSTOMERID
               INNER JOIN ORDERS O ON O.CUSTOMERID = C.CUSTOMERID

    As you can see, the solution scales well and can take advantage of many of the aggregate functions!

    Read the article

  • WebGrid Helper and Complex Types

    - by imran_ku07
    Introduction: the WebGrid helper makes it very easy to show tabular data. It was originally designed for ASP.NET Web Pages (WebMatrix) to display, edit, page and sort tabular data, but you can also use this helper in ASP.NET Web Forms and ASP.NET MVC. When using this helper, you may sometimes run into a problem if you use complex types. In this article, I will show you how you can use complex types in the WebGrid helper.

    Description: let's say you need to show employee data and you have the following classes:

        public class Employee
        {
            public string Name { get; set; }
            public Address Address { get; set; }
            public List<string> ContactNumbers { get; set; }
        }

        public class Address
        {
            public string City { get; set; }
        }

    The Employee class contains a Name, an Address and a list of ContactNumbers. You may think that you can easily show the city in WebGrid using Address.City, but no: the WebGrid helper will throw an exception at runtime if Address is null for any Employee in the list. Also, you cannot directly show the ContactNumbers property. The easiest way to show these properties is to add some additional properties:

        public Address NotNullableAddress
        {
            get { return Address ?? new Address(); }
        }

        public string Contacts
        {
            get { return string.Join("; ", ContactNumbers); }
        }

    Now you can easily use these properties in WebGrid. Here is the complete code for this example:

        @functions{
            public class Employee
            {
                public Employee()
                {
                    ContactNumbers = new List<string>();
                }
                public string Name { get; set; }
                public Address Address { get; set; }
                public List<string> ContactNumbers { get; set; }
                public Address NotNullableAddress
                {
                    get { return Address ?? new Address(); }
                }
                public string Contacts
                {
                    get { return string.Join("; ", ContactNumbers); }
                }
            }

            public class Address
            {
                public string City { get; set; }
            }
        }
        @{
            var myClasses = new List<Employee>{
                new Employee { Name="A", Address = new Address{ City="AA" }, ContactNumbers = new List<string>{"021-216452","9231425651"}},
                new Employee { Name="C", Address = new Address{ City="CC" }},
                new Employee { Name="D", ContactNumbers = new List<string>{"045-14512125","21531212121"}}
            };
            var grid = new WebGrid(source: myClasses);
        }
        @grid.GetHtml(columns: grid.Columns(
            grid.Column("NotNullableAddress.City", header: "City"),
            grid.Column("Name"),
            grid.Column("Contacts")))

    Summary: you can use the WebGrid helper to show tabular data in ASP.NET MVC, ASP.NET Web Forms and ASP.NET Web Pages, and you can also show complex types in the grid. In this article, I showed you how to use complex types with the WebGrid helper. Hopefully you will enjoy this article too.

    Read the article

  • Welcome Oracle Data Integration 12c: Simplified, Future-Ready Solutions with Extreme Performance

    - by Irem Radzik
    The big day for the Oracle Data Integration team has finally arrived! It is my honor to introduce you to Oracle Data Integration 12c. Today we announced the general availability of the 12c release for Oracle's key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence. With the 12c release Oracle becomes the new leader in data integration and replication technologies, as no other vendor offers such a complete set of data integration capabilities for pervasive, continuous access to trusted data across Oracle platforms as well as third-party systems and applications.

    The Oracle Data Integration 12c release addresses data-driven organizations' critical and evolving data integration requirements under three key themes: future-ready solutions, extreme performance, and fast time-to-value. There are many new features that support these key differentiators for Oracle Data Integrator 12c and Oracle GoldenGate 12c. In this first 12c blog post, I will highlight only a few.

    Future-Ready Solutions to Support Current and Emerging Initiatives: Oracle Data Integration offers robust and reliable solutions for key technology trends including cloud computing, big data analytics, real-time business intelligence and continuous data availability. Via the tight integration with Oracle's database, middleware, and application offerings, Oracle Data Integration will continue to support new features and capabilities right away as these products evolve and provide advanced features.

    Extreme Performance: both GoldenGate and Data Integrator are known for their high performance. The new release widens the gap even further against the competition. Oracle GoldenGate 12c's Integrated Delivery feature enables higher throughput via a special application programming interface into Oracle Database. As mentioned in the press release, customers already report up to 5X higher performance compared to earlier versions of GoldenGate. Oracle Data Integrator 12c introduces parallelism that significantly increases its performance as well.

    Fast Time-to-Value via Higher IT Productivity and Simplified Solutions: Oracle Data Integrator 12c's new flow-based declarative UI brings superior developer productivity, ease of use, and ultimately fast time to market for end users. It also gives the ability to seamlessly reuse mapping logic, which speeds development. Oracle GoldenGate 12c's Integrated Delivery feature automatically and optimally tunes the process, saving time while improving performance.

    This is just a quick glimpse into Oracle Data Integrator 12c and Oracle GoldenGate 12c. On November 12th we will reveal much more about the new release in our video webcast "Introducing 12c for Oracle Data Integration". Our customer and partner speakers, including SolarWorld, BT, and Rittman Mead, will join us in launching the new release. Please join us at this free event to learn more from our executives about the 12c release, hear our customers' perspectives on the new features, and ask your questions to our experts in the live Q&A. Also, please continue to follow our blogs, tweets, and Facebook updates as we unveil more about the new features of the latest release.

    Read the article

  • Do MORE with WebCenter

    - by Michael Snow
    We've been extremely busy here on the Oracle WebCenter team. We hope that you've all been keeping up with the interesting news each week. Last week was jammed full of Gartner PCC and Gartner360 buzz. If you missed any of the highlights, be sure to check out both Kellsey's post from last week, Gartner PCC: A Shovel & Some Ah-Ha's, and Christie's overview of Loren Weinberg's PCC presentation, "Here Today, Gone Tomorrow: Engage Your Customers or Lose Them". This week, we'll be focusing on "Doing More with WebCenter", leading up to a great webcast scheduled for Thursday, March 22 (invite and registration details below). This is the 2nd in a series of 3 webcasts dedicated to expanding the understanding of the full capabilities of WebCenter. Yes, that might mean that you are not getting the full benefits of the software you already own, or the expansion potential via upgrade to the full WebCenter Suite Plus. Tune in on Thursday 10 a.m. PT / 1 p.m. ET.

    Want to be a Speaker at Oracle OpenWorld 2012? Oracle OpenWorld planning has already kicked off. We know that it is only March and next October is far in the distance, but planning has already started for Oracle OpenWorld 2012. So if you want to be a speaker and propose your own session for this year's event in San Francisco on September 30th - October 4th, start thinking now! The annual OpenWorld Call for Papers is now open until April 9th! All of the details to submit a paper are available here. Of course, the WebCenter team here is interested in sessions including case studies, thought-leadership, and customer stories around any of the Oracle WebCenter solutions, but the Call for Papers is open to all Oracle topics. When submitting your topic, be sure to describe what you plan to discuss and the value of the presentation to other attendees. Sell your session, because there will be a lot of competition to be selected. Bonus news: speakers for selected sessions receive a complimentary full conference pass! Get your papers in and we'll see you in San Francisco!

    Webcast Series: Do More with Oracle WebCenter - Expand Beyond Content Management. Enable employees, partners, and customers to do more with your content. Did you know that, in addition to content management, Oracle WebCenter now also includes comprehensive portal, composite application, collaboration, and Web experience management capabilities? Join us for this Webcast and learn how you can provide a new level of user engagement. Learn how Oracle WebCenter:

    - Drives task-specific application data and content to a single screen for executing specific business processes
    - Enables mixed internal and external environments where content can be securely shared and filtered with employees, partners, and customers, based upon role-based security
    - Offers Web experience management, driving contextually relevant, social, and interactive online experiences across multiple channels
    - Provides social features that enable sharing, activity feeds, collaboration, expertise location, and best-practices communities

    Learn how to do more with Oracle WebCenter. Register now for the second Webcast in the series "Do More With Oracle WebCenter": March 22, 2012, 10 a.m. PT / 1 p.m. ET. Presented by Michelle Huff, Senior Director, WebCenter Product Management, Oracle, and Greg Utecht, Project Manager, IT Operations, TIES.

    Read the article

  • NetBeans PHP Community Council

    - by Tomas Mysik
    Hi all, today we would like to let you know that you now have a chance to improve NetBeans via the NetBeans PHP Community Council. The author of this activity is Timur Poperecinii, and he would like to tell you a few words about it.

    Hello passionate technical people! First of all let me introduce myself: my name is Timur, I'm a developer from Moldova (that little country between Romania and Ukraine). I develop mostly in .NET and jQuery, but I love to learn more; without being an expert, I am familiar with Java (Struts2, Play), PHP (Symfony2), Ruby (Rails), Sencha Touch 2 and other technologies. I was "introduced" to PHP recently by a client of mine who requested that the work be done specifically in PHP.

    Let me tell you a little story about my experience with open source and IDEs. When I was studying at university, in 2007 I think, I did a simple little application in PHP and thought, "Damn, if only there was a good IDE for PHP, so I could relax and not have to remember all the function names." But when I searched the internet, pretty much everyone was using Vim or Emacs on Linux, with syntax highlighting but no autocomplete. I remember using a tool like Notepad++. Nowadays everything has changed: we have highlighting and autocomplete for just about all standard things in PHP in many IDEs. I use NetBeans for PHP, and I am really happy with the experience I have there with standard PHP code, but for frameworks I still think there is a lot of room for improvement. For example, we have some Symfony2 and Twig support, but I'd love to see more of that coming. I'm a big fan of file templates, where the main goal is not to waste time writing over and over again something that can be generated, and it counts even more when you don't have a lot of autocomplete. So what I thought was: "Hey, I know Java a little, and NetBeans has plugins, so maybe it's worth trying to write a file templates plugin", and so I did. You can find details about my Unified Udevi Symfony2 Plugin for NetBeans 7.2 on my blog. It wasn't hard, and it was even fun!

    Give back to open source. Now think a little: NetBeans is an open source project and PHP support is just a part of it, so the resources are pretty limited in this area. But we, as the community that uses this product, want to have the best possible experience with PHP and frameworks. So why don't we GIVE BACK TO OPEN SOURCE? Imagine an IDE that can do all the things you want, and it is free. Now how far is NetBeans from that point? I guess not so far; you might miss a little niche thing that you use on a daily basis, but then the question appears: why don't you make it happen on your own?

    NetBeans PHP Community Council. What I proposed is to create a NetBeans PHP Community Council that will be formed of people willing to change something, willing to create plugins for their own needs and for the needs of the community, to test the plugins created by them, and basically to evolve NetBeans in the direction they want it to reach. I have already talked with the NetBeans PHP team. They are only happy to help this Council, with technical advice, opening some APIs we might need to have access to, and other things. One important thing to mention is that this Council is a community project; though we'll have direct discussions with the NetBeans PHP dev team, NetBeans is not the leading force here, it is the community. You can see more details about the goals and structure I proposed at the NetBeans PHP Community Council wiki page.
    We use this mail list: [email protected] for discussions and topics related to the Council.

    How can I join? To join the NetBeans PHP Community Council, please send an email to [email protected] with the subject of the mail starting with [Council New Member]. You can subscribe to this mail list here: http://netbeans.org/projects/php/lists. In your mail, please indicate your location, age and experience, both in Java and PHP. I need these data to assign you to a team. A response will be sent to you with your next assignment and some people to contact. I really hope that you'll take a step forward and try to make your everyday use of NetBeans even more fun.

    Read the article

  • Sneak Peek: Social Developer Program at JavaOne

    - by Mike Stiles
    By guest blogger Roland Smart

    We're just days away from what is gunning to be the most exciting installment of OpenWorld to date, so how about an exciting sneak peek at the very first Social Developer Program? If your first thought is, "What's a social developer?" you're not alone. It's an emerging term, and one we think will gain prominence as social experiences become more prevalent in enterprise applications. For those who keep an eye on the ever-evolving Facebook platform, you'll recall that they recently rebranded their PDC (preferred developer consultant) group as the PMD (preferred marketing developer), signaling the importance of development resources inside the marketing organization to unlock the potential of social. The marketing developer they're referring to could be considered a social developer in a broader context.

    While it's true social has really blossomed in the marketing context and CMOs are winning more and more technical resources, social is starting to work its way more deeply into the enterprise with the help of developers who work outside marketing. Developers, like the rest of us, have fallen in "like" with social functionality and are starting to imagine how social can transform enterprise applications in the way it has consumer-facing experiences. The thesis of my presentation is that social developers will take many pages from the marketing playbook as they apply social inside the enterprise. To support this argument, let's walk through a range of enterprise applications and explore how consumer-facing social experiences might be interpreted in this context.

    Here's one example of how a social experience could be integrated into a sales enablement application. As a marketer, I spend a great deal of time collaborating with my sales colleagues, so I have good insight into their working process. While at Involver, we grew our sales team quickly, and it became evident some of our processes broke with scale. For example, we used to have weekly team meetings at which we'd discuss what was working and what wasn't from a messaging perspective. One aspect of these sessions focused on "objections" and "responses", where the salespeople would walk through common objections to purchasing and share appropriate responses. We tried to map each context to the best answers, and we'd capture these on a wiki page. As our team grew, however, participation at scale just wasn't tenable, and our wiki pages quickly lost their freshness.

    Imagine giving salespeople a place where they could submit common objections and responses for their colleagues to see, sort, comment on, and vote on (one possible data model is sketched below). What you'd get is an up-to-date and relevant repository of information. And, if you supported an application like this with a social graph, it would be possible to make good recommendations to individual salespeople about the objections they'd likely hear based on vertical, product, region or other graph data. Taking it even further, you could build in a badging/game element to reward those salespeople who participate the most.

    These examples are based on proven models at work inside consumer-facing applications. If you want to learn about how HR, Operations, Product Development and Customer Support can leverage social experiences, you're welcome to join us at JavaOne or join our Social Developer Community to find some of the presentations after OpenWorld.
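    To make the sales-enablement idea above concrete, here is a minimal sketch of such an objection/response repository with voting. All names here are hypothetical, invented for illustration; nothing below comes from the talk itself, and a real system would back this with a database and a proper social graph.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        // Hypothetical model of the objection/response repository described above.
        public class Response
        {
            public string Author { get; set; }
            public string Text { get; set; }
            public int Votes { get; set; }
        }

        public class Objection
        {
            public string Text { get; set; }
            public string Vertical { get; set; }   // graph data: vertical, product, region...
            public List<Response> Responses { get; set; } = new List<Response>();

            // Colleagues' votes push the most useful response to the top.
            public IEnumerable<Response> BestResponsesFirst() =>
                Responses.OrderByDescending(r => r.Votes);
        }

        public class ObjectionRepository
        {
            private readonly List<Objection> _objections = new List<Objection>();

            public void Submit(Objection objection) => _objections.Add(objection);

            // A crude stand-in for graph-based recommendations: surface the
            // objections a salesperson is most likely to hear in their vertical.
            public IEnumerable<Objection> LikelyFor(string vertical) =>
                _objections.Where(o => o.Vertical == vertical);
        }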

    Read the article

  • EPM 11.1.2.2 Architecture: Financial Performance Management Applications

    - by Marc Schumacher
    Financial Management can be accessed either by a browser-based client or by SmartView. Starting from release 11.1.2.2, the Financial Management Windows client no longer accesses the Financial Management Consolidation server; all tasks that require an online connection (e.g. load and extract tasks) can only be done using the web interface. Any client connection initiated by a browser or SmartView is sent to the Oracle HTTP Server (OHS) first. Based on the path given in the URL (e.g. hfmadf, hfmofficeprovider), OHS decides whether to forward the request to the new Financial Management web application based on the Oracle Application Development Framework (ADF) or to the .NET-based application serving SmartView retrievals, which runs on Internet Information Services (IIS). Any requests sent to the ADF web interface that need to be processed by the Financial Management application server are sent to IIS using the HTTP protocol and forwarded further using DCOM to the Financial Management application server. SmartView requests, which are processed by IIS in the first place, are forwarded to the Financial Management application server using DCOM as well. The Financial Management application server uses OLE DB database connections via native database clients to talk to the Financial Management database schema. Communication between the Financial Management DME Listener, which handles requests from EPMA, and the Financial Management application server is based on DCOM.

    Unlike most other components, Essbase Analytics Link (EAL) does not have an end-user interface. The only user interface is a plug-in for the Essbase Administration Services console, which is used for administration purposes only. End users interact with a Transparent or Replicated Partition that is created in Essbase and populated with data by EAL. The Analytics Link Server deployed on WebLogic communicates through the HTTP protocol with the Analytics Link Financial Management Connector that is deployed in IIS on the Financial Management web server. The Analytics Link Server interacts with the Data Synchronization Server using the EAL API. The Data Synchronization Server acts as a target of a Transparent or Replicated Partition in Essbase and uses a native database client to connect to the Financial Management database. The Analytics Link Server uses JDBC to connect to relational repository databases and the Essbase JAPI to connect to Essbase.

    As with most Oracle EPM System products, browser-based clients and SmartView can be used to access Planning. The Java-based Planning web application is deployed on WebLogic, which is configured behind an Oracle HTTP Server (OHS). Communication between Planning and the Planning RMI Registry Service is done using Java Remote Method Invocation (RMI). Planning uses JDBC to access relational repository databases and talks to Essbase using the CAPI. Be aware that besides the Planning System database, a dedicated database schema is needed for each application that is set up within Planning.

    Like Planning, Profitability and Cost Management (HPCM) has a fairly simple architecture. Besides the browser-based clients and SmartView, a web service consumer can be used as a client too. All clients access the Java-based web application deployed on WebLogic through Oracle HTTP Server (OHS). Communication between Profitability and Cost Management and the EPMA Web Server is done using the HTTP protocol. JDBC is used to access the relational repository databases as well as data sources, and the Essbase JAPI is utilized to talk to Essbase.

    For Strategic Finance, two clients exist: SmartView and a Windows client. While SmartView communicates through the web layer to the Strategic Finance Server, the Strategic Finance Windows client makes a direct connection to the Strategic Finance Server using RPC calls. Connections from Strategic Finance Web as well as from Strategic Finance Web Services to the Strategic Finance Server are made using RPC calls too. The Strategic Finance Server uses its own file-based data store. JDBC is used to connect to the EPM System Registry from the web and application layers.

    Disclosure Management has three kinds of clients. While the browser-based client and SmartView interact with the Disclosure Management web application directly through Oracle HTTP Server (OHS), the Taxonomy Designer does not connect to the Disclosure Management server. Communication with relational repository databases is done via JDBC; to connect to Essbase, the Essbase JAPI is utilized.
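    As an illustration of the path-based routing OHS performs in front of Financial Management, the fragments below sketch what the proxy rules could look like. This is an assumption for illustration only: the host name and ports are invented, and in a real deployment the EPM System Configurator generates the actual rules. The weblogic-handler block forwards the ADF web application to WebLogic, while an Apache ProxyPass rule hands SmartView traffic to the IIS-hosted .NET provider.

        # mod_wl_ohs.conf -- forward the ADF web application to WebLogic
        # (hypothetical host/port; real values are written by the EPM configurator)
        <Location /hfmadf>
            SetHandler weblogic-handler
            WebLogicHost epmweb01.example.com
            WebLogicPort 7363
        </Location>

        # httpd.conf -- hand SmartView requests to the .NET provider running on IIS
        ProxyPass        /hfmofficeprovider http://epmweb01.example.com:19000/hfmofficeprovider
        ProxyPassReverse /hfmofficeprovider http://epmweb01.example.com:19000/hfmofficeprovider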

    Read the article

  • 5 Ways Android Still Disappoints (Me)

    - by TStewartDev
    Let me make this clear: I'm annoyed with Apple. I don't like their current policies and I don't like where Steve Jobs is taking the company. In general, I don't like it when any one company gets too much control in a market. When that happens, the leading company dictates the game and, as consumers, our options all but disappear. That said, I'm still going to buy a new iPhone next week. My Apple-hating friends seem to desperately want me to go Android instead, but frankly, it's not good enough for me, and here are the reasons why.

    The Modern WinMo
    One of the reasons that Microsoft has identified for Windows Mobile's rapid decline is the breadth of hardware. They exercised little control over manufacturers' implementations. In theory, that sounds great: we as consumers have lots of choice. In practice, though, it meant among other things that updates to the devices were left up to the manufacturers. As a result, that rarely happened. (I'm still bitter at Toshiba for leaving me hanging back in 2002.) And now, Google is doing the same thing with Android. Case in point: my wife has a Motorola Backflip that we bought in April. It was released in March. Motorola says it will get Android 2.1 "sometime in Q3". Great. Meanwhile, I pull down the latest version of iPhone OS (now iOS) and install it the same day it's released. You may say that I can't judge Android by one lazy manufacturer. Yup, I sure can. With Apple, my original iPhone has been supported perfectly for 3 years. With Android, I will have to wait for upgrades after Google releases them, possibly indefinitely. Not cool.

    AT&T
    We signed a new contract with AT&T in April to get my wife's phone. I've had a reasonable experience with them. I don't imagine my experience with Verizon would be any better, and I'm relatively confident that Sprint doesn't have the coverage it takes to work well for us. The fact is, AT&T, for whatever reason, doesn't have jack for Android phones. That may not be Android's fault, but it's still a shortcoming that prevents me from having it, just like the iPhone's exclusivity keeps some folks on other networks from having it.

    Innovation? What Innovation?
    Android has a nice dashboard and a great notification system and… nothing else original. I keep reading about how disappointing the iPhone is nowadays. "It has no innovation," people say. Who does? Android has modeled its behavior after the iPhone. That's fine, but if all you've got is a similar product and I'm invested both skill-wise and app-wise in my current platform, why should I change? Microsoft's new Windows Phone 7 looks somewhat innovative, and I'm pretty excited to see what they'll bring to the table, but that's another six months away, at least. I've got a 3-year-old phone that has some annoying issues now (thanks to recent encounters with water). I need a new phone now.

    Is This Going to Work?
    There's no shortage of criticism of Apple over its App Store policies, and I've vented my own anger about it. However, I will give them credit: their screening of apps has done a great job of weeding out the crap and gives an excellent indication that the app will work on my device. How about Android? Nope. It might work on your phone. Maybe. You'll have to try it to see. Get burned by it? Well, write a nasty review to try to keep others from making the mistake you did. If you don't mind doing that stuff, then Android is the platform for you. Personally, I'd rather have a receptionist screening out the telemarketing and survey calls than hang up on them myself, but that's your call.

    Slow, Slowing, Slower
    All this yapping about multitasking. This is an area where I've been on Apple's side from the beginning. Sorry folks, but this is the number one reason I hated Windows Mobile: the longer you use it, the slower it gets, because it doesn't kill apps. I'm with Steve Jobs on this one: if you see a task manager, we're doing it wrong. I don't want to have to manually kill apps. I hate doing that on Windows, let alone on a mobile device. To me, priority one should be keeping the device speedy. Waiting for your device to respond is unacceptable.

    Bonus!
    Taken from "iPhone Letdown? 8 Things Apple Didn't Announce", here are my responses:

    4G: Yeah, let me know if your area actually has it. I live in Lincoln, Nebraska. No carrier is going to have 4G here for at least 3 years. Meanwhile, you still get to pay for it. Yay!

    Cloud iTunes/OTA Sync: You got me here. Of course, whether or not your Android device will be able to do it is always a good question.

    3G Video Chat: You got me here, too. I'm sure you spent countless hours in front of your phone with video chat. Also, I can't wait for the "No Video Chat While Driving" laws.

    Mobile Hotspot: This is a neat feature, but as the author points out, it's left up to the carrier whether to implement it or not. Pretty sure any Android phones that come to AT&T won't have this enabled in the foreseeable future. Is Verizon even allowing this? I just figured Sprint was because they're failing so hard at keeping customers.

    Free MobileMe: I use Google's services with my iPhone. The only people I know who use MobileMe are Apple fanboys and fangirls. If you choose to pay for a service that you can get for free, that's your decision, not Apple's.

    Voice Input: Voice input has been available on phones (even "dumb" phones) for years now. The iPhone does have the ability, though limited. Why don't I hear people telling their phones what to do? Maybe because it's still easier to use your fingers than talk to it. Get back to me when this becomes an important feature.

    Free Navigation: Maybe this will be a bigger deal to me now that I'm getting a phone with GPS, but when using my buddy's 3GS, Google Maps has worked just fine. Maybe I just don't trust turn-by-turn navigation enough to want it.

    Dashboard: The only legitimate complaint on this list, to me. The iPhone's home screen is pathetic, doubly so for the iPad. What a waste of perfectly usable space.

    I also want to add notifications to this list. Android's notification panel is far superior to the iPhone's. I don't want to hunt all over my screen to find little red dots. Put 'em in one place, Apple.

    Read the article

  • Must go through Windows Boot Loader to get to Grub

    - by Zach
    I just installed a fresh copy of Precise alongside Windows 7. I have two separate 750GB hard drives; /dev/sda holds the Windows partitions and /dev/sdb holds the Ubuntu partitions. Other than that, these are fresh installs of both Windows 7 and Ubuntu 12.04.

    Whenever I boot, GRUB doesn't load; instead it goes to a black screen with a single blinking (horizontal bar) cursor in the top right corner. However, if I hit Escape right as the BIOS/POST screen finishes up, I see the Windows Boot Loader, and hitting Escape again takes me back to the BIOS screen. After the BIOS screen, GRUB shows up and everything functions normally; I can boot into Ubuntu or Win7. I don't want to have to do the Escape, Escape, Wait, Boot trick every time. I have no idea what would be wrong or what information I could give you to help diagnose. I have run sudo update-grub and it found everything normally. I tried adding the nomodeset flag to the GRUB_CMDLINE_LINUX_DEFAULT line in /etc/default/grub, which searching around made me think might work. Thoughts on what I could do to fix this?

    EDIT: I've tried changing the boot order so that each of the drives in the BIOS (both are labeled as "Internal HDD") has had a try booting first. I think the problem may be that every time I boot, the BIOS boot order is different and I have to reset it; it seems not to be stable, but I'm not sure how to go about fixing that either. The machine has both traditional BIOS and UEFI. It came standard in "Legacy" mode, so it is currently set to boot through Legacy mode. I've reinstalled Ubuntu now, and now if I hit Escape at the end of the BIOS/POST startup screen, it takes me to the GRUB menu; otherwise it automatically loads Windows. It seems like GRUB is now the acting boot loader, it just doesn't start automatically unless I ask the BIOS for a boot menu. On my other machines, it has always started automatically at the end of BIOS/POST.

    EDIT2: Using GParted, I just looked at my partitions; it would seem that my linux-swap partition is currently flagged as the boot partition for my Ubuntu install. I currently have only 2 partitions: one "ext4" with a mount point of "/" and no flag, and the "linux-swap" with no mount point and the "boot" flag. Changing the boot flag to be on "/" does not reliably solve the problem. After 10 boots:

    2 booted successfully to GRUB
    5 booted directly to Windows 7
    3 booted to the black screen with the cursor and hung there

    Further research makes me think this is an issue of the BIOS not reliably booting the hard drives in the same order, or not finding both hard drives. If I ask it to create a "boot menu", sometimes it has 2 entries for "Internal HDD", sometimes 1. Also, the list it creates changes order every time I bring it up, so it is not following a consistent boot sequence. Will report back if this is not an issue with GRUB.
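    For reference, the two changes discussed above look like this. This is a sketch only: the device names assume the two-drive layout described in the question, and installing GRUB to both MBRs is a commonly suggested workaround for unstable BIOS drive ordering rather than a guaranteed fix.

        # /etc/default/grub -- the nomodeset experiment mentioned above
        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"

        $ sudo update-grub          # regenerate grub.cfg after editing the file

        # Put GRUB on the MBR of *both* drives, so whichever disk the BIOS
        # happens to pick first still has a boot loader:
        $ sudo grub-install /dev/sda
        $ sudo grub-install /dev/sdb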

    Read the article

  • NHibernate and Stored Procedures in C#

    - by Jess Nickson
    I was recently trying and failing to set up NHibernate (v1.2) in an ASP.NET project. The aim was to execute a stored procedure and return the results, but it took several iterations for me to end up with a working solution. In this post I am simply trying to put the required code in one place, in the hope that the snippets may be useful in guiding someone else through the same process. As it is kind of the first time I have had to play with NHibernate, there is a good chance that this solution is sub-optimal and, as such, I am open to suggestions on how it could be improved! There are four code snippets that I required:

    1. The stored procedure that I wanted to execute
    2. The C# class representation of the results of the procedure
    3. The XML mapping file that allows NHibernate to map from C# to the procedure and back again
    4. The C# code used to run the stored procedure

    The Stored Procedure
    The procedure was designed to take a UserId and, from this, go and grab some profile data for that user. Simple, right? We just need to do a join first, because the user's site ID (the one we have access to) is not the same as the user's forum ID.

        CREATE PROCEDURE [dbo].[GetForumProfileDetails]
        (
            @userId INT
        )
        AS
        BEGIN
            SELECT Users.UserID,
                   forumUsers.Twitter,
                   forumUsers.Facebook,
                   forumUsers.GooglePlus,
                   forumUsers.LinkedIn,
                   forumUsers.PublicEmailAddress
            FROM Users
            INNER JOIN Forum_Users forumUsers
                ON forumUsers.UserSiteID = Users.UserID
            WHERE Users.UserID = @userId
        END

    I'd like to make a shout out to Format SQL for its help with, well, formatting the above SQL!

    The C# Class
    This is just the class representation of the results we expect to get from the stored procedure. NHibernate requires a virtual property for each column of data, and these properties must have the same names as the column headers. You will also need to ensure that there is a public or protected parameterless constructor.

        public class ForumProfile : IForumProfile
        {
            public virtual int UserID { get; set; }
            public virtual string Twitter { get; set; }
            public virtual string Facebook { get; set; }
            public virtual string GooglePlus { get; set; }
            public virtual string LinkedIn { get; set; }
            public virtual string PublicEmailAddress { get; set; }

            public ForumProfile() { }
        }

    The NHibernate Mapping File
    This is the XML I wrote in order to make NHibernate a) aware of the stored procedure, and b) aware of the expected results of the procedure.

        <?xml version="1.0" encoding="utf-8" ?>
        <hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
                           namespace="[namespace]" assembly="[assembly]">
          <sql-query name="GetForumProfileDetails">
            <return-scalar column="UserID" type="Int32"/>
            <return-scalar column="Twitter" type="String"/>
            <return-scalar column="Facebook" type="String"/>
            <return-scalar column="GooglePlus" type="String"/>
            <return-scalar column="LinkedIn" type="String"/>
            <return-scalar column="PublicEmailAddress" type="String"/>
            exec GetForumProfileDetails :UserID
          </sql-query>
        </hibernate-mapping>

    Calling the Stored Procedure
    Finally, to bring it all together, the C# code that I used in order to execute the stored procedure!

        public IForumProfile GetForumUserProfile(IUser user)
        {
            return NHibernateHelper
                .GetCurrentSession()
                .GetNamedQuery("GetForumProfileDetails")
                .SetInt32("UserID", user.UserID)
                .SetResultTransformer(
                    Transformers.AliasToBean(typeof(ForumProfile)))
                .UniqueResult<ForumProfile>();
        }

    There are a number of 'Set' methods (e.g. SetInt32) that allow you to specify values for any parameters in the procedure. The AliasToBean method is then required to map the returned scalars (as specified in the XML) to the correct C# class.
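    The one piece the post does not show is NHibernateHelper. As a sketch only (this helper is assumed, not part of the original article), a minimal version that builds the session factory once from hibernate.cfg.xml and hands out sessions might look like:

        using NHibernate;
        using NHibernate.Cfg;

        // Assumed helper: the session factory is expensive to build, so create
        // it once and reuse it; sessions themselves are cheap and short-lived.
        public static class NHibernateHelper
        {
            private static readonly ISessionFactory SessionFactory =
                new Configuration().Configure().BuildSessionFactory();

            public static ISession GetCurrentSession()
            {
                return SessionFactory.OpenSession();
            }
        }

    In a real ASP.NET application the session would typically be scoped to the request rather than opened ad hoc, but that detail is beyond the original post.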

    Read the article

  • TechEd 2010 Important Events

    If you'll be attending TechEd in New Orleans in a couple of weeks, make sure the following are all on your calendar:

    Party with Palermo, TechEd 2010 Edition
    Sunday 6 June 2010, 7:30pm-9:30pm Central Time
    RSVP and see who else is coming here. The party takes place from 7:30pm to 9:30pm Central (local) time, and includes a full meal, free swag, and prizes. The event is being held at Jimmy Buffett's Margaritaville, located at 1104 Decatur Street.

    Developer Practices Session: DPR304 FAIL: Anti-Patterns and Worst Practices
    Monday 7 June 2010, 4:30pm-5:45pm Central Time, Room 276
    Come to my session and hear about what NOT to do on your software project. Hear my own and others' war stories and lessons learned. You'll laugh, you'll cry, you'll realize you're a much better developer than a lot of folks out there. Here's the official description: "Everybody likes to talk about best practices, tips, and tricks, but often it is by analyzing failures that we learn from our own and others' mistakes. In this session, Steve describes various anti-patterns and worst practices in software development that he has encountered in his own experience or learned about from other experts in the field, along with advice on recognizing and avoiding them." View DPR304 in the TechEd Session Catalog >>

    Exhibition Hall Reception
    Monday 7 June 2010, 5:45pm-9pm
    Immediately following my session, come meet the show's exhibitors, win prizes, and enjoy plenty of food and drink. Always a good time.

    Party: Geekfest
    Tuesday 8 June, 8pm-11pm Central Time, Pat O'Brien's
    Let's face it, going to a technical conference is good for your career, but it's not a whole lot of fun. You need an outlet. You need to have fun. Cheap beer and lousy pizza (with a New Orleans twist). We are bringing back GeekFest! Join us at Pat O'Brien's for a night of gumbo, beer and hurricanes. There are limited invitations available, so what are you waiting for? If you are attending the TechEd 2010 conference and you are a developer, you are invited. To register, pick up your "duck" ticket (and wristband) in the TechEd Technical Learning Center (TLC) at the Developer Tools & Languages (DEV) information desk. You must have a wristband to get in. Tuesday, June 8th, from 8pm to 11pm, Pat O'Brien's New Orleans, 624 Bourbon Street, New Orleans, LA 70130.

    Closing Party at Mardi Gras World
    Thursday 10 June, 7:30pm-10pm Central Time
    Join us for the Closing Party and enjoy great food, beverages, and the excitement of New Orleans at Mardi Gras World. The colors, the lights, the music, the joie de vivre... it's all here. Learn more >>

    Read the article

  • Security Access Control With Solaris Virtualization

    - by Thierry Manfe-Oracle
    Numerous Solaris customers consolidate multiple applications or servers on a single platform. The resulting configuration consists of many environments hosted on a single infrastructure, and security constraints sometimes exist between these environments. Recently, a customer consolidated many virtual machines belonging to both their intranet and extranet on a pair of SPARC Solaris servers interconnected through InfiniBand. Virtual machines were mapped to Solaris Zones, and one security constraint was to prevent SSH connections between the intranet and the extranet. This case study gives us the opportunity to understand how the Oracle Solaris Network Virtualization Technology, a.k.a. Project Crossbow, can be used to control outbound traffic from Solaris Zones.

    Solaris Zones from both the intranet and extranet use an InfiniBand network to access a ZFS Storage Appliance that exports NFS shares. Solaris global zones on both SPARC servers mount iSCSI LUs exported by the Storage Appliance, and the non-global zones are installed on these iSCSI LUs. With no security hardening, if an extranet zone gets compromised, the attacker could try to use the Storage Appliance as a gateway to the intranet zones or, even worse, to the global zones, as all the zones are reachable from this node.

    One solution consists in using Solaris Network Virtualization Technology to stop outbound SSH traffic from the Solaris Zones. The virtualized network stack provides per-network-link flows. A flow classifies network traffic on a specific link. As an example, on the network link used by a Solaris Zone to connect to the InfiniBand fabric, a flow can be created for TCP traffic on port 22, thereby creating a flow for the SSH traffic. A bandwidth can be specified for that flow and, if set to zero, the traffic is blocked. Last but not least, flows are created from the global zone, which means that even with root privileges in a Solaris Zone an attacker cannot disable or delete a flow. With the flow approach, the outbound traffic of a Solaris Zone is controlled from outside the zone. Schema 1 describes the new network setting once the security has been put in place.

    Here are the instructions to create a Crossbow flow as used in Schema 1:

    (GZ)# zoneadm -z zonename halt

    This halts the Solaris Zone.

    (GZ)# flowadm add-flow -l iblink -a transport=TCP,remote_port=22 -p maxbw=0 sshFilter

    This creates a flow on the IB partition "iblink" used by the zone to connect to the InfiniBand fabric. The IB partition can be identified by intersecting the output of the commands 'zonecfg -z zonename info net' and 'dladm show-part'. The flow is created on port 22, for TCP traffic, with a zero maximum bandwidth. The name given to the flow is "sshFilter".

    (GZ)# zoneadm -z zonename boot

    This restarts the Solaris Zone now that the flow is in place.

    Solaris Zones and Solaris Network Virtualization enable SSH access control on InfiniBand (and on Ethernet) without the extra cost of a firewall. With this approach, no change is required on the InfiniBand switch. All the security enforcements are put in place at the Solaris level, minimizing the impact on the overall infrastructure. The Crossbow flows come in addition to many other security controls available with Oracle Solaris, such as IP Filter and Role Based Access Control, that can be used to tackle security challenges.
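    Once the zone is back up, the flow can be inspected (and, if ever needed, removed) from the global zone. The subcommands below are the standard Solaris 11 flowadm inspection subcommands, shown here as a sketch with the flow name from the example above:

        (GZ)# flowadm show-flow                          # list the flows defined on the system
        (GZ)# flowadm show-flowprop -p maxbw sshFilter   # confirm the bandwidth cap is still 0
        (GZ)# flowadm remove-flow sshFilter              # lift the restriction if required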

    Read the article

  • PARTNER WEBCAST SERIES: INNOVATIONS IN APPLICATIONS - PROGRAM

    - by mseika
    Dear Partner,

    We are pleased to invite you to join the Innovations in Applications webcast series. Innovations in Applications presents Oracle products' new functions and features, including sales positioning. The key objectives of these webcasts are to inspire partners' personnel to conduct successful sales, after-sales and delivery at their customers, to inspire you to pursue further product training and certifications, and finally to give you a chance to join the ecosystem's product-specific communities to learn and to contribute.

    Innovations in Applications will be presented as per the schedule below, after the billable day (4:00 to 5:00 PM CET). The webcasts are intended for partners' Implementation Certified Specialists, but Innovations in Applications is open to other partner personnel as well. First, an Oracle representative will discuss Oracle's contribution to partners. Then you will see a product breakout session followed by Q&A with Oracle experts. Each session will last a maximum of 1 hour, and a Q&A document covering all questions and answers will be made available after the webcast.

    What are the benefits for partners?
    - Find out how Innovations in Applications helps you to improve your sales, after-sales and delivery
    - Discover new functions and features so you can enrich your customers' solutions
    - Learn more about Oracle products, especially sales positioning
    - Hear crucial questions raised by colleagues, and learn from their interests
    - Engage and present your questions to subject experts
    - Be inspired by the richness of Oracle's product portfolio, for your and your customers' benefit
    - Be inspired to seek further product training and certifications. Make your competence known and recognized! Brand yourself!

    Note: Should you already be familiar with a specific product, then choose another one; doing so will expand your knowledge of the overall product portfolio. Some presentations contain product demonstrations, although these presentations are not intended to be extremely detailed technical presentations.

    Useful links for you to bookmark:
    - To access previously presented product presentations and Public Sector Value Proposition presentations, please go to the Recordings tab.
    - You might want to bookmark the Enablement blog page, Oracle Partner Enablement. Please check this regularly, as we publish lots of good content here just for you.
    - You might want to bookmark the Knowledge Zones page for solution-focused pages designed to jump-start your path towards Specialization.
    - You might want to bookmark the global event calendar page, events.oracle.com.

    Delivery Format
    Innovations in Applications is a series of FREE prerecorded Oracle product presentations followed by Q&A, delivered over the web. Participants have the opportunity to submit questions during the webcast via chat, and subject matter experts will provide verbal answers live. Innovations in Applications consists of several parallel prerecorded product breakout sessions, each lasting a maximum of 1 hour. First, an Oracle representative will discuss Oracle's contribution to partners; then you'll see the product breakout sessions followed by Q&A with Oracle experts. A Q&A document covering all questions and answers will be made available after the webcast. You can also watch Innovations in Applications afterwards, as its content will be available online for the next 6-12 months.
    The next Innovations in Applications webcasts will be presented on July 1st 2013 (please see the Next Webcast tab). For more information please click here.

    Note: Depending on local network bandwidth, please allow a few seconds for the presentations to download. You might want to refresh your screen by pressing F5.

    Duration: Maximum 1 hour

    For further information please contact Markku Rouhiainen.

    Read the article

< Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >