Search Results



  • Autoscaling in a modern world… Part 1

    - by Steve Loethen
    It has been a while since I have had time to sit down and blog. I need to make sure I take the time; it helps me focus on technology and not let the administrivia keep me from doing the things I love. I have been focusing on the cloud for the last couple of years, specifically Microsoft's PaaS platform, Azure. Time to dig in.

    I wanted to explore autoscaling. Autoscaling is not a native part of Azure. The platform has the needed connection points: you can write code that looks at the health and performance of your application components and reacts with the needed scaling changes. But that means you have to write all the code. Luckily, an add-on to the Enterprise Library provides a lot of code that gets you a long way toward autoscaling without having to start from scratch.

    The tool set is primarily composed of an Autoscaler object that you need to host. This object, when hosted and configured, watches the performance criteria you specify and adjusts your application based on your needs. Sounds perfect.

    I started with a set of HOLs (hands-on labs) that gave me a good basis to understand the mechanics. I worked through labs 1 and 2 just to get the feel, but let's start our saga at the end of lab 3. Lab 3 ends with a web application hosted in Azure and a console app running on premise. The web app has a few buttons on it: one set adds messages to a queue, another removes them; a second set of buttons drives processor utilization to 100%. If you want to guess, a safe bet is that the Autoscaler is configured to react to a queue that has filled up or to high CPU usage. We will continue our saga in the next post…
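    The post describes hosting the Autoscaler but stops before showing the host itself, so here is a minimal sketch of what the on-premise console host can look like. The container call is the same one used later in this series; the namespaces and the surrounding scaffolding are my assumptions from the WASABi (Enterprise Library Autoscaling Application Block) documentation, not code from the lab.

        // Minimal console host for the WASABi Autoscaler (sketch).
        // Assumes the Autoscaling Application Block is referenced and its
        // rules/service-information stores are configured in app.config.
        using System;
        using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
        using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Autoscaling;

        class AutoscalerHost
        {
            static void Main()
            {
                // Resolve the configured Autoscaler from the EntLib container.
                Autoscaler scaler =
                    EnterpriseLibraryContainer.Current.GetInstance<Autoscaler>();

                scaler.Start();    // begins evaluating the rules on its timer
                Console.WriteLine("Autoscaler running - press Enter to stop.");
                Console.ReadLine();
                scaler.Stop();     // stop cleanly before the process exits
            }
        }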

    Read the article

  • Isometric screen to 3D world coordinates efficiently

    - by Justin
    Been having a difficult time transforming 2D screen coordinates to 3D isometric space. This is a situation where I am working in 3D but with an orthographic camera, positioned at (100, 200, 100), where the xz plane is flat and y is up and down. I've been able to get a sort of working solution, but I feel like there must be a better way. Here's what I'm doing: with my camera at (0, 1, 0) I can translate my screen coordinates directly to 3D coordinates by doing:

        mouse2D.z = ((event.clientX / window.innerWidth) * 2 - 1) * -(window.innerWidth / 2);
        mouse2D.x = ((event.clientY / window.innerHeight) * 2 + 1) * -(window.innerHeight);
        mouse2D.y = 0;

    Everything is okay so far. Now when I change my camera back to (100, 200, 100), my 3D space has been rotated 45 degrees around the y axis and then about 54 degrees (arctan √2 ≈ 54.7°, the usual isometric tilt) around a vector Q that runs along the xz plane at a 45 degree angle between the positive z axis and the negative x axis. So to find the point, I first rotate my point by 45 degrees using a matrix around the y axis. Now I'm close. Then I rotate my point around the vector Q. But my point is closer to the origin than it should be, since the y value is no longer 0. What I want is for the y value to be 0 after the rotation. So now I exchange the x and z coordinates of my rotated vector with the x and z coordinates of my non-rotated vector; basically I have my old vector, but its y value is at the appropriate rotated amount. Now I use another matrix to rotate my point around the vector Q in the opposite direction, and I end up with the point where I clicked. Is there a better way? I feel like I must be missing something. Also, my method isn't completely accurate: it lands within 5-10 coordinates of where I click, maybe because of rounding from so many calculations. Sorry for such a long question.
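    A common alternative that avoids the chained rotations entirely is to cast a ray from the camera through the clicked point and intersect it with the ground plane y = 0. This is not the asker's code; it is a sketch of that alternative (shown in C# with illustrative names, though the question itself is three.js-flavored JavaScript), assuming origin and dir have already been obtained by inverting the orthographic camera's transform:

        // Sketch: map a clicked point to the y == 0 ground plane by ray-plane
        // intersection. 'origin' is the click lifted onto the camera's near
        // plane in world space; 'dir' is the camera's viewing direction.
        public static class GroundPicker
        {
            public static (double X, double Y, double Z) ScreenToGround(
                (double X, double Y, double Z) origin,
                (double X, double Y, double Z) dir)
            {
                // Solve origin.Y + t * dir.Y == 0 for t, then walk the ray.
                double t = -origin.Y / dir.Y;  // dir.Y != 0 for a tilted camera
                return (origin.X + t * dir.X, 0.0, origin.Z + t * dir.Z);
            }
        }

    Because every click is resolved exactly on the plane, there is no accumulated rounding from multiple rotation matrices.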

    Read the article

  • Autoscaling in a modern world… Part 4

    - by Steve Loethen
    Now that I have the rules and services XML files in the cloud, it is time to sever the bonds of earth and live totally in the cloud. I have to host the Autoscaling object in Azure as well, point it to the rules, tell it about the management certs, and get out of the way. A couple of questions. Where to host? The most obvious place to me was a worker role: a simple, single-purpose worker role doing nothing but watching my app. Here are the steps I used.

    1) Created a project, separate from my web site. I wanted to be able to run the web app in the cloud and the autoscaler locally for debugging purposes. Seemed like the easiest way.
    2) Added the WASABi block to the project.
    3) Configured the settings. I used the same settings used for the console app: it points to the same web role and uses the same rules file.
    4) Made sure the certificate needed to manage the role is added to the cert store in the sky ("LocalMachine" and "My" are the default locations).

    I ran the worker role in the local fabric. It worked. I then published to the cloud and verified it worked again. Here is what my code looked like.

        public override bool OnStart()
        {
            Trace.WriteLine("Set Default Connection Limit", "Information");
            // Set the maximum number of concurrent connections
            ServicePointManager.DefaultConnectionLimit = 12;

            Trace.WriteLine("Set up configuration change code", "Information");
            CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
                configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)));

            Trace.WriteLine("Get current diagnostic configuration", "Information");
            DiagnosticMonitorConfiguration dmc =
                DiagnosticMonitor.GetDefaultInitialConfiguration();

            Trace.WriteLine("Set Diagnostic Buffer Size", "Information");
            dmc.Logs.BufferQuotaInMB = 4;

            Trace.WriteLine("Set log transfer period", "Information");
            dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

            Trace.WriteLine("Set log verbosity", "Information");
            dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

            Trace.WriteLine("Start the diagnostic monitor", "Information");
            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", dmc);

            Trace.WriteLine("Get the current Autoscaler from the EntLib Container", "Information");
            scaler = EnterpriseLibraryContainer.Current.GetInstance<Autoscaler>();

            Trace.WriteLine("Start the autoscaler", "Information");
            scaler.Start();

            Trace.WriteLine("Call the base class OnStart", "Information");
            return base.OnStart();
        }

        public override void OnStop()
        {
            Trace.WriteLine("Stop the Autoscaler", "Information");
            scaler.Stop();
        }

    I did have to turn on some basic logging for WASABi, which I will cover in the next post. That logging let me figure out that I hadn't done the certificate step.
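    Since the missing certificate is what tripped me up, a quick sanity check against the store before deploying can save a cycle. This snippet is mine, not part of the lab, and the thumbprint is a placeholder:

        // Sketch: confirm the management certificate is where WASABi expects
        // it (LocalMachine/My), looking it up by thumbprint.
        using System;
        using System.Security.Cryptography.X509Certificates;

        class CertCheck
        {
            static void Main()
            {
                var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
                store.Open(OpenFlags.ReadOnly);
                var matches = store.Certificates.Find(
                    X509FindType.FindByThumbprint,
                    "PASTE-MANAGEMENT-CERT-THUMBPRINT-HERE",  // placeholder
                    validOnly: false);
                Console.WriteLine(matches.Count > 0
                    ? "Management certificate found."
                    : "Certificate missing - import it before deploying.");
                store.Close();
            }
        }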

    Read the article

  • Autoscaling in a modern world… last chapter

    - by Steve Loethen
    As we all know as coders, things like logging are never important; our code will work right the first time. So you can understand my surprise when, the first time I deployed the autoscaling worker role to the actual Azure fabric, it did not scale. I mean, it worked on my machine. How dare the datacenter argue with that. So, how did I track down the problem? (Turns out it was not so much code as the lack of the right certificate.) When I ran it locally in the developer fabric, I could see a wealth of information: lots of periodic status info every time the autoscaler came around to check on my rules and decide whether to act. But that information was not making it to Azure storage. The diagnostics were not being transferred to where I could easily see them and use them to track down why things were not being cooperative. After a bit of digging, I discovered the problem: you need to add a bit of extra configuration to get the correct information stored for you. I added the following to my app.config:

        <system.diagnostics>
          <sources>
            <source name="Autoscaling General" switchName="SourceSwitch"
                    switchType="System.Diagnostics.SourceSwitch">
              <listeners>
                <add name="AzureDiag" />
                <remove name="Default" />
              </listeners>
            </source>
            <source name="Autoscaling Updates" switchName="SourceSwitch"
                    switchType="System.Diagnostics.SourceSwitch">
              <listeners>
                <add name="AzureDiag" />
                <remove name="Default" />
              </listeners>
            </source>
          </sources>
          <switches>
            <add name="SourceSwitch"
                 value="Verbose, Information, Warning, Error, Critical" />
          </switches>
          <sharedListeners>
            <add name="AzureDiag"
                 type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
          </sharedListeners>
          <trace>
            <listeners>
              <add name="AzureDiagnostics"
                   type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
                <filter type="" />
              </add>
            </listeners>
          </trace>
        </system.diagnostics>

    Suddenly all the rich tracing info I needed was filling up my storage account. After a few cycles of attempting to scale, I identified the cert problem, uploaded a correct certificate, and away it went. I hope this was helpful.
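    If you want to confirm the listener wiring before waiting on the autoscaler itself, you can push a test event through one of the same trace sources. This is my addition, not from the original post; the source name matches the config above:

        // Sketch: emit a test entry through the "Autoscaling General" source so
        // it flows to the Azure diagnostics listener configured above.
        var source = new System.Diagnostics.TraceSource("Autoscaling General");
        source.TraceEvent(System.Diagnostics.TraceEventType.Information, 0,
            "trace wiring test - should appear in storage after the transfer period");
        source.Flush();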

    Read the article

  • Oracle ADF 11g - Invitation to the News Online Sessions - next date: 18 March 2011

    - by heidrun.walther
    What is ADF? ADF stands for Oracle Application Development Framework. ADF implements the JEE standards and extends their functionality, particularly in the number of components provided (above all for visualization) and in flow control (task flows replace page flows). ADF is one of the building blocks on which the development of all new Oracle application systems is based (including vertical solutions and the administration tools). The development tool used is Oracle JDeveloper. Rapid Application Development (RAD) is enabled by declarative, metadata-driven development that relies heavily, at every level, on templating (that is, working from predefined patterns) and on reuse. Development and documentation happen in a single step. ADF works seamlessly with the other Oracle SOA tools and brings a role- and policy-driven access system; it can be integrated with Oracle Identity Management.

    ADF News Online Sessions? The ADF News Online Sessions give tips from users and decision makers for users and decision makers, and offer an exchange of ideas on using ADF and on delivering ADF projects. The speakers are employees of Oracle partner companies and Oracle ADF specialists. The contents of the fourth series:
    18.02.11 - Managing migration projects: Forms to ADF / experience report
    04.03.11 - Using Groovy in Oracle ADF Business Components (English)
    18.03.11 - Task-flow-oriented development with UI Shell
    01.04.11 - First concepts, overview, integration: Desktop ADF
    15.04.11 - ADF best practices: structuring ADF BC
    29.04.11 - Customizing ADF applications at runtime (end users) with Oracle WebCenter
    You will receive the dial-in details for a session either by joining the mailing list (mail to [email protected]) or via the ADF Community pages on XING, by registering for the session in question.

    Oracle ADF Community? The Oracle ADF Community aims to exchange information and experience about Oracle ADF and to make the Oracle Application Development Framework (ADF) development platform better known among developers, users, and IT service providers. You are warmly invited to take an active part. More in the ADF Community group on XING.

    Read the article

  • Ubuntu, openSUSE, the world of Linux for a web developer

    - by SonofWatson
    I'm learning web development. My main OS is Windows 7, but I've used Linux and am currently dual-booting with Ubuntu. My Linux knowledge, however, is pretty limited. I can work with the command line on simple tasks, but that's pretty much it. I don't do any shell scripting, and I don't know the most important commands very well, nor the system in general. Given my interest in web development, should I get more familiar with Linux? Is it a must for future job positions, considering my field of interest?

    Read the article

  • Real-world use cases for Smalltalk

    - by Andrea Spadaccini
    Hello, I've been playing a bit with Smalltalk, and I found it interesting. I know there are some classic examples of Smalltalk: the Smalltalk images themselves and the Seaside web framework, and there are lots of in-house custom applications built using the language. I'd like to know:
    - whether there are applications in active use and development apart from the ones I mentioned;
    - whether there are software houses that use Smalltalk for their day-to-day work;
    - when you would choose Smalltalk over another language to develop a new application from scratch.

    Read the article

  • Computer Science Degrees and Real-World Experience

    - by Steven Elliott Jr
    Recently, at a family reunion-type event, I was asked by a high school student how important it is to get a computer science degree in order to get a job as a programmer, as opposed to relying on actual programming experience. The kid has been working with Python and the Blender project, as he's into making games and the like; it sounds like he has some decent programming chops. Now, as someone who has gone through a computer science degree, my initial response is to say, "You absolutely MUST get a computer science degree in order to get a job as a programmer!" However, as I thought about this, I was unsure whether my initial reaction was due in part to my own suffering as a CS student or because I feel that this is actually the case.

    For me, I can say that I rarely use anything I learned in college in terms of the extremely hard math, algorithms, and so on, but I did come away with a decent attitude and the willingness to work through tough problems. I just don't know what to tell this kid. I feel like I should tell him to do the CS degree, but I have hired so many programmers who majored in things like English, Philosophy, and other liberal-arts degrees, and even some who never went to college. In fact, my best developer falls into this latter category: he got started writing software for his church or something, and then it took off into a passion.

    So, while I know this is one of those juicy potential downvote questions, I am just curious what everyone else thinks about this topic. What would you tell a high school kid in this situation? Perhaps if he or she already knows a good deal of programming and loves it, a CS degree isn't necessary, and he could expand his horizons with a liberal-arts degree. I know one of the creators of the Django web framework was an American Literature major, and he is obviously a pretty gifted developer. Anyway, thanks for the consideration.

    Read the article

  • SPARC T4-2 Produces World Record Oracle Essbase Aggregate Storage Benchmark Result

    - by Brian
    Significance of Results

    Oracle's SPARC T4-2 server, configured with a Sun Storage F5100 Flash Array and running Oracle Solaris 10 with Oracle Database 11g, has achieved exceptional performance on the Oracle Essbase Aggregate Storage Option benchmark. The benchmark has upwards of 1 billion records, 15 dimensions, and millions of members. Oracle Essbase is a multi-dimensional online analytical processing (OLAP) server and is well suited to SPARC T4 servers.

    The SPARC T4-2 server (2 CPUs) running Oracle Essbase 11.1.2.2.100 outperformed the previously published results on Oracle's SPARC Enterprise M5000 server (4 CPUs) with Oracle Essbase 11.1.1.3 on Oracle Solaris 10 by 80% on data loading and 32% on default aggregation, and delivered a 2x improvement on usage-based aggregation. The SPARC T4-2 server with the Sun Storage F5100 Flash Array and Oracle Essbase on Oracle Solaris 10 achieves sub-second query response times for 20,000 users against a 15-dimension database, and it aggregated and stored values for a 15-dimension cube in 398 minutes with 16 threads and 484 minutes with 8 threads. The Sun Storage F5100 Flash Array provides more than a 20% out-of-the-box improvement over a mid-size fibre channel disk array for default and user-based aggregation; combined with Oracle Essbase and Oracle Solaris ZFS, it provides the best combination for large Oracle Essbase databases, taking advantage of high bandwidth for faster load and aggregation.

    Oracle Fusion Middleware provides a family of complete, integrated, hot-pluggable, and best-of-breed products known for enabling enterprise customers to create and run agile and intelligent business applications. Oracle Essbase's performance demonstrates why so many customers rely on Oracle Fusion Middleware as their foundation for innovation.

    Performance Landscape

        System                                 Data Size             Database Load   Default Aggregation   Usage Based Aggregation
                                               (millions of items)   (minutes)       (minutes)             (minutes)
        SPARC T4-2, 2 x SPARC T4 2.85 GHz      1000                  149             398*                  55
        Sun M5000, 4 x SPARC64 VII 2.53 GHz    1000                  269             526                   115
        Sun M5000, 4 x SPARC64 VII 2.4 GHz     400                   120             448                   18

        * 398 minutes with CALCPARALLEL set to 16; 484 minutes with CALCPARALLEL set to 8

    Configuration Summary

    Hardware: 1 x SPARC T4-2 with 2 x 2.85 GHz SPARC T4 processors, 128 GB memory, and 2 x 300 GB 10000 RPM SAS internal disks.
    Storage: 1 x Sun Storage F5100 Flash Array with 40 x 24 GB flash modules and a SAS HBA with 2 SAS channels; data striped (RAID 0) on Oracle Solaris ZFS.
    Software: Oracle Solaris 10 8/11; installer v11.1.2.2.100; Oracle Essbase Client v11.1.2.2.100; Oracle Essbase v11.1.2.2.100; Oracle Essbase Administration Services 64-bit; Oracle Database 11g Release 2 (11.2.0.3); HP's Mercury Interactive QuickTest Professional 9.5.0.

    Benchmark Description

    The objective of the Oracle Essbase Aggregate Storage Option benchmark is to showcase the ability of Oracle Essbase to scale in terms of user population and data volume for large enterprise deployments. Typical administrative and end-user operations for OLAP applications were simulated to produce the benchmark results, which include:
    - Database Load: time elapsed to build a database, including outline and data load.
    - Default Aggregation: time elapsed to build the aggregation.
    - User-Based Aggregation: time elapsed to build the aggregate views proposed as a result of tracked retrieval queries.
    Summary of the data used for this benchmark: 40 flat files, each 1.2 GB in size (49.4 GB in total); 10 million rows per file, 1 billion rows total; 28 columns of data per row. The database outline has 15 dimensions (five of them attribute dimensions); the Customer dimension has 13.3 million members; there are 3 rule files.

    Key Points and Best Practices

    The Sun Storage F5100 Flash Array was used to accelerate application performance. Setting the data-load threads (DLTHREADSPREPARE) to 64 and the load buffer to 6 improved data loading by about 9%. The factors influencing aggregation-materialization performance are the aggregate storage cache and the number of threads (CALCPARALLEL) for parallel view materialization. The optimal values for this workload on the SPARC T4-2 server were an aggregate storage cache of 32 GB and CALCPARALLEL of 16.

    See Also

    Oracle Essbase Aggregate Storage Option Benchmark on Oracle's SPARC T4-2 Server - oracle.com
    Oracle Essbase - oracle.com / OTN
    SPARC T4-2 Server - oracle.com / OTN
    Oracle Solaris - oracle.com / OTN
    Oracle Database 11g Release 2 Enterprise Edition - oracle.com / OTN

    Disclosure Statement

    Copyright 2012, Oracle and/or its affiliates. All rights reserved. Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners. Results as of 28 August 2012.
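    For reference, the two threading knobs called out above are essbase.cfg settings. A sketch of how they might look for this workload; the values mirror the writeup, the comments are mine, and the aggregate storage cache is sized separately (per application) rather than here:

        ; essbase.cfg - illustrative tuning values from this benchmark writeup
        CALCPARALLEL 16        ; threads for parallel view materialization
        DLTHREADSPREPARE 64    ; threads preparing data-load buffers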

    Read the article

  • Open World Day 3

    - by Antony Reynolds
    A Day in the Life of an Oracle OpenWorld Attendee, Part IV. My third day was exhibition day for me! I took the opportunity to wander around the JavaOne and OpenWorld exhibitions to see what might be useful for me when selling WebLogic, Coherence & SOA Suite. I found a number of interesting vendors and thought I would share what I found here. These are not necessarily endorsements, but observations on companies that I thought had interesting-looking products that fill a need I have seen at customers.

    Highly Available EBS Upgrades. A few years ago I worked with a customer that was a port authority. They wanted to tie E-Business Suite into their operations to provide faster processing of cargo and passengers. However, they only had a 2-hour downtime window to perform upgrades. This was not a problem for core database and middleware technology, which could accommodate those upgrade timescales easily. It was a problem for EBS, however, so I was intrigued to find Rapid E-Suite Inc offering an 11i to 12i upgrade service that claims to require no outage. This could be a real boon to EBS customers like my port friends that need to upgrade without disruption to their business.

    Mobile on WebLogic. I have come across a number of customers who want a comprehensive mobile solution, with connected and disconnected operation and so forth. ADF only addresses part of these requirements currently, so I was excited to discover mFrontiers Inc offering an apparently comprehensive solution that should integrate easily with Oracle SOA Suite to mobile-enable a SOA infrastructure. The ability to operate without a network is important for many applications, particularly in industries that require their engineers to enter buildings to perform maintenance or repairs, because network access is not always available - many of my colleagues don't have mobile access from their homes because they live in the middle of nowhere - and disconnected support is crucial in these situations.

    Sharepoint Connector for WebCenter Content. Obviously Sharepoint is an evil, pernicious intrusion into a company's IT estate, but it is widely deployed, many people like it, and they would also like to take advantage of Oracle products such as WebCenter Content. So I was encouraged to see that Fishbowl Solutions have created a connector for Sharepoint that allows it to bring in content from WebCenter. It looks like a valuable way to keep the Sharepoint interface end users are used to while extending the range of content by pulling stuff (technical term for content) from WebCenter.

    Load Balancing. The Enterprise Deployment Guides are Oracle's bible on building highly available FMW environments, and each of them requires a front-end load balancer. I have been asked to help configure F5 load balancers on a number of occasions over my time at Oracle, and each time I come back to it I find more useful features have been added to the BigIP line of load balancers that F5 sell; many of their documents are tailored to FMW. I like F5: they provide (relatively) easy-to-use products that do what they say on the side of the box. They may not have all the bells and whistles of some of their more expensive competitors, but they do the job and do it well! Besides which, I like their logo!

    Other Stuff. I saw lots of other interesting products and services, such as a lightweight monitoring tool for Coherence, Forms migration services, JCAPS migration services, and lots of cool freebies to take home to the children!

    A Quiet Night. Wednesday night was the partner appreciation event, and I had decided to go back to the hotel and have an early night. I decided to attend the last session of the day - a Maven/Hudson/WebLogic tutorial. I got the wrong hotel for the session, snuck in 20 minutes late at the back, and started working on the hands-on workshop. One of my co-attendees raised his hand for help, and as the presenter came over to help he suddenly stopped and yelled, "Is that Antony?!" It was my old friend Steve Button, who used to be based in Redwood Shores but is now a WebLogic guru PM in Australia. It was good to catch up with him. As he yelled out, a guy with really bad posture turned around to see who he was talking to; this turned out to be my friend Simon Haslam, Oracle ACE from the UK. After the tutorial, Simon and I retired to the coffee shop to catch up and share stories. Two and a half hours later we decided it was time to retire - so much for an early night, but great to renew old friendships and find out what real customers are worrying about.

    Read the article

  • Google, Facebook, and Yahoo to test IPv6 at full scale during the protocol's World Day in June 2011

    Update of 12.11.2010 by Katleen: The transition from IPv4 to IPv6 could leave the Internet unstable for a period of up to several years, according to Vint Cerf. Vint Cerf, a computing pioneer whom some consider the father of the Net, spoke this week about the protocol change. This change, necessary for the web to continue its expansion, worries him. He also faults companies, which "do not look far enough ahead" and did not plan the migration earlier. Indeed, we will have to move from IPv4 (which will likely be saturated by January 2012) to IPv6. And the transition p...

    Read the article

  • Missed OpenWorld 2011 or JavaOne? See the Key Announcements Today

    - by Dain C. Hansen
    Learn more about the Oracle OpenWorld and JavaOne key announcements through our six on-demand webcasts or podcasts. Your time is precious, and you can't make time to watch all the keynotes and sessions on demand. Want a concise overview of the Oracle OpenWorld and JavaOne key announcements? Presented by Oracle experts in EMEA, these six webcasts will help you decide which keynotes, general sessions, or solution sessions from Oracle OpenWorld and JavaOne could be of most interest to you. Six informative, on-demand sessions are available as podcasts and webcasts on Oracle hardware and software, each taking just 15-20 minutes.

    Be updated in an hour on Oracle OpenWorld, covering:
    - Oracle Exadata and Exalogic engineered systems with Oracle Applications
    - Oracle Exalytics Business Intelligence Machine, the industry's first in-memory hardware and software system
    - Oracle Big Data Appliance, the end-to-end solution for big data
    - Oracle Enterprise Manager 12c, the industry's first solution to combine management of the full Oracle stack with complete enterprise cloud lifecycle management
    - Oracle Fusion Applications, a complete suite with 100+ modules
    - Oracle Public Cloud, with subscription-based, self-service access to Oracle Fusion Applications, Oracle Fusion Middleware, and Oracle Database

    Watch the six JavaOne key announcement webcasts anywhere you can access the Internet and learn more about:
    - Plans for advancing the Java Platform, Standard Edition (Java SE), and an update on Java SE 8
    - Plans announced for the evolution of the Java Platform, Micro Edition
    - Availability of JavaFX 2.0
    - NetBeans IDE availability for Windows, Mac, Linux, and Oracle Solaris
    - The latest developments in the evolution of the Java Platform, Enterprise Edition (Java EE)
    - Oracle Java Cloud Service

    Follow other informative, on-demand sessions on Oracle hardware and software, on topics like Cloud, Exadata, Exalogic, Exalytics, Big Data Appliance, Enterprise Manager 12c, hardware (SuperCluster, servers, and storage), and Oracle Fusion Applications. Register now!

    Read the article

  • Save the dates – Tech.Days 2011, 23rd to 25th of May in London

    - by Eric Nelson
    In May, Microsoft UK (and specifically my group) will be delivering Tech.Days - a week of day-long technical events plus evening activities. We will be covering Windows Phone 7, Silverlight, IE9, the Windows Azure Platform, and more. I'm working right now on the details of what we will cover around the Windows Azure Platform, and it is shaping up very nicely. There is a little more detail over on TechNet, but for the moment, keep the dates clear if you can. P.S. I think the above is called a "teaser" in marketing speak.

    Read the article

  • Cloud Builder Event Series Continues Around the World

    - by Sandra Cheevers
    Are you building an enterprise cloud? Make sure you attend a Cloud Builder Summit at one of many worldwide locations. Designed for executives, cloud architects, and IT operations professionals, this event series will eventually reach over 100 cities around the globe. This free, live event features demonstrations of how to build an enterprise cloud. Learn how to fast-track applications to the cloud with Oracle, and how to support every aspect of architecting, planning, deploying, monitoring, and managing enterprise clouds. Here's a photo from one of the Cloud Builder events in Kuala Lumpur, Malaysia.

    Read the article

  • Podcast Show Notes: Old Habits Die Hard in the New SOA World

    - by OTN ArchBeat
    Like the previous series, the latest OTN ArchBeat podcast program was recorded in a hotel room just around the corner from Oracle OpenWorld in San Francisco a few weeks ago. The gathered experts, all members of the OTN architect community, agreed to participate in an informal roundtable discussion of what's happening in service-oriented architecture. As you'll hear, the conversation ranged from the maturity of SOA technology and tools, to the lingering and typically self-imposed problems that can prevent organizations from realizing the full potential of SOA, to what SOA means in the era of *aaS, mobile computing, and big data.

    [Photos: Hajo Normann, Torsten Winterberg, Ronald van Luttikhuizen, and Guido Schmutz (L-R); Hajo Normann, Torsten Winterberg, Danilo Schmiedel, and Lonneke Dikmans (L-R)]

    The Panelists (listed alphabetically):
    - Lonneke Dikmans, Managing Partner at Vennster, Oracle ACE Director
    - Ronald van Luttikhuizen, Managing Partner at Vennster, Oracle ACE Director
    - Hajo Normann, SOA & BPM Lead for ASG at Accenture, Oracle ACE Director
    - Danilo Schmiedel, Solution Architect at Opitz Consulting
    - Guido Schmutz, Technology Manager for SOA/BPM and Architecture Board at Trivadis, Oracle ACE Director
    - Torsten Winterberg, Director of Strategy and Innovation and head of the SOA Competence Center at Opitz Consulting, Oracle ACE Director

    The Conversation:
    - Listen to Part 1: SOA technology and tools are mature, says this panel of experts, but why do some organizations still struggle to take full advantage of industrialized SOA?
    - Listen to Part 2 (Nov 6): Human nature and a lack of trust among stakeholders can thwart successful SOA. Can a marketplace approach and social tools improve the situation?
    - Listen to Part 3 (Nov 13): Do SOA stakeholders recognize the problems caused by poor communication among siloed service development teams?

    Coming Soon. SOA and B2B: the authors of Getting Started with Oracle SOA B2B Integration: A Hands-On Tutorial discuss business-to-business capabilities in Oracle SOA Suite 11g.

    Want to be a guest producer for an OTN ArchBeat podcast? Put your topic and panelist suggestions in a comment on this post, or contact me at @OTNArchBeat.

    Read the article

  • Apress Deal of the day - 23/Feb/2011 - Ultra-Fast ASP.NET: Building Ultra-Fast and Ultra-Scalable Websites Using ASP.NET and SQL Server

    - by TATWORTH
    Today's $10 deal of the day at http://www.apress.com/info/dailydeal is Ultra-Fast ASP.NET: Building Ultra-Fast and Ultra-Scalable Websites Using ASP.NET and SQL Server by Rick Kiessig - ISBN 978-1-4302-2383-2. I won a copy of this book at 101 Books. Rick Kiessig is an all-star member of forums.asp.net - see http://forums.asp.net/members/RickNZ.aspx. This book has been the deal of the day before; if you did not get a copy then, I suggest getting it today. From the publisher: "Ultra-Fast ASP.NET provides a practical guide to building extremely fast and scalable web sites using ASP.NET and SQL Server. It strikes a balance between imparting usable advice and backing that advice up with supporting background information. $49.99 | Published Nov 2009 | Rick Kiessig"

    Read the article

  • Real World NuGet

    - by JoshReuben
    Why NuGet? It gives you a higher level of granularity for managing references: when you have solutions of many projects that depend on other solutions of many projects, and so on, NuGet is the escape from Solution Hell.

    Links:
    - Using a GUI (Package Explorer) to build packages - http://docs.nuget.org/docs/creating-packages/using-a-gui-to-build-packages
    - Creating a nuspec file - http://msdn.microsoft.com/en-us/vs2010trainingcourse_aspnetmvcnuget_topic2.aspx
    - Consuming a NuGet package - http://msdn.microsoft.com/en-us/vs2010trainingcourse_aspnetmvcnuget_topic3
    - Nuspec reference - http://docs.nuget.org/docs/reference/nuspec-reference
    - Updating packages - http://nuget.codeplex.com/wikipage?title=Updating%20All%20Packages
    - Versioning - http://docs.nuget.org/docs/reference/versioning

    POC Folder Structure

    POC Setup Steps

    Install Package Explorer.

    Source: create a source solution and configure the output directory for its projects (Project > Properties > Build > Output Path).

    Package: add the assemblies to the package from the output directory (drag and drop) and add a net folder. Then File > Export to save the .nuspec file and lib contents:

        <?xml version="1.0" encoding="utf-16"?>
        <package>
          <metadata>
            <id>MyPackage</id>
            <version>1.0.0.3</version>
            <title />
            <authors>josh-r</authors>
            <owners />
            <requireLicenseAcceptance>false</requireLicenseAcceptance>
            <description>My package description.</description>
            <summary />
          </metadata>
        </package>

    File > Save then writes the .nupkg file.

    Create the target solution: in Tools > Options, configure the package source, then add the package and select the projects. Output from the package manager (PowerShell console):

        ------- Installing...MyPackage 1.0.0 -------
        Added file 'NugetSource.AssemblyA.dll' to folder 'MyPackage.1.0.0\lib'.
        Added file 'NugetSource.AssemblyA.pdb' to folder 'MyPackage.1.0.0\lib'.
        Added file 'NugetSource.AssemblyB.dll' to folder 'MyPackage.1.0.0\lib'.
        Added file 'NugetSource.AssemblyB.pdb' to folder 'MyPackage.1.0.0\lib'.
        Added file 'MyPackage.1.0.0.nupkg' to folder 'MyPackage.1.0.0'.
        Successfully installed 'MyPackage 1.0.0'.
        Added reference 'NugetSource.AssemblyA' to project 'AssemblyX'
        Added reference 'NugetSource.AssemblyB' to project 'AssemblyX'
        Added file 'packages.config'.
        Added file 'packages.config' to project 'AssemblyX'
        Added file 'repositories.config'.
        Successfully added 'MyPackage 1.0.0' to AssemblyX.

    A packages folder is created at the solution level, and a packages.config file is generated in each project:

        <?xml version="1.0" encoding="utf-8"?>
        <packages>
          <package id="MyPackage" version="1.0.0" targetFramework="net40" />
        </packages>

    A local packages folder is created for the package versions installed; each version's folder contains the downloaded .nupkg file and its unpacked contents, e.g. the DLLs that the project references. Note: this folder is not checked in.

    UpdatePackages

    Configure the package manager to automatically check for updates, then browse packages - it automatically picks up the updates.

    Update Procedure

    - Modify the source
    - Change the source version in the assembly info
    - Build the source
    - Open the last package in Package Explorer
    - Increment the package version number and re-add the assemblies
    - Save the package with the new version number and export its definition
    - In the target solution, Tools > Manage NuGet Packages - click All to trigger a refresh, then click Recent Packages to see the updates
    - If problematic, delete the packages folder

    Versioning

        uninstall-package mypackage
        install-package mypackage -version 1.0.0.3
        uninstall-package mypackage
        install-package mypackage -version 1.0.0.4

    Dependencies

        <?xml version="1.0" encoding="utf-16"?>
        <package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
          <metadata>
            <id>MyDependentPackage</id>
            <version>1.0.0</version>
            <title />
            <authors>josh-r</authors>
            <owners />
            <requireLicenseAcceptance>false</requireLicenseAcceptance>
            <description>My package description.</description>
            <dependencies>
              <group targetFramework=".NETFramework4.0">
                <dependency id="MyPackage" version="1.0.0.4" />
              </group>
            </dependencies>
          </metadata>
        </package>

    Using NuGet without committing packages to source control (http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages): right-click the Solution node in Solution Explorer and select Enable NuGet Package Restore - recall that the packages folder is not part of the solution. If you get "downloading package 'Nuget.build' failed", configure the proxy to support the certificate for https://nuget.org/api/v2/ and allow unrestricted access to packages.nuget.org. To test connectivity: get-package -listavailable. To test NuGet Package Restore, delete the packages folder and open Visual Studio as admin. The NuGet MSBuild hook is:

        <Import Project="$(SolutionDir)\.nuget\nuget.targets" />

    TFSBuild Integration

    Modify the Nuget.Targets file:

        <RestorePackages Condition=" '$(RestorePackages)' == '' ">True</RestorePackages>
        ...
        <PackageSource Include="\\IL-CV-004-W7D\Packages" />

    Add the system environment variable EnableNuGetPackageRestore=true and restart the "Visual Studio Team Foundation Build Service Host" service. Important: ensure Network Service has access to the Packages folder.

    Nugetter TFS Build Integration

    Add the Nugetter build process templates to TFS source control, and specify the location of custom assemblies for the build controller. Generate the .nuspec file from Package Explorer (File > Export) and edit the file elements to remove path info from the src and target attributes:

        <?xml version="1.0" encoding="utf-16"?>
        <package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
          <metadata>
            <id>Common</id>
            <version>1.0.0</version>
            <title />
            <authors>josh-r</authors>
            <owners />
            <requireLicenseAcceptance>false</requireLicenseAcceptance>
            <description>My package description.</description>
            <dependencies>
              <group targetFramework=".NETFramework3.5" />
            </dependencies>
          </metadata>
          <files>
            <file src="CommonTypes.dll" target="CommonTypes.dll" />
            <file src="CommonTypes.pdb" target="CommonTypes.pdb" />
            ...

    Add the .nuspec file to the solution so that it is available for the build, e.g. Dev\NovaNuget\Common\NuSpec\common.1.0.0.nuspec. Then add a build process definition based on the Nugetter build process template and configure it, specifying: the .sln to build, the base path (output directory), the nuget.exe file path, and the .nuspec file path.

    Copy DLLs to a binary folder

    1) Set Copy Local for an assembly reference to false.
    2) Use the MSBuild Copy task (http://msdn.microsoft.com/en-us/library/3e54c37h.aspx) by modifying the .csproj file:

        <ItemGroup>
          <MySourceFiles Include="$(MSBuildProjectDirectory)\..\SourceAssemblies\**\*.*" />
        </ItemGroup>
        <Target Name="BeforeBuild">
          <Copy SourceFiles="@(MySourceFiles)" DestinationFolder="bin\debug\SourceAssemblies" />
        </Target>

    3) Set the probing assembly search path in app.config (http://msdn.microsoft.com/en-us/library/823z9h8w(v=vs.80).aspx):

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <runtime>
            <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
              <probing privatePath="SourceAssemblies" />
            </assemblyBinding>
          </runtime>
        </configuration>

    Forcing 'copy local = false'

    The following generic PowerShell script was added to the package's install.ps1. An empty project named "CopyPackages" was added to the solution; it references all the packages and is the only one set to CopyLocal="true". No MSBuild knowledge required.

        param($installPath, $toolsPath, $package, $project)
        if ($project.Object.Project.Name -ne "CopyPackages")
        {
            $asms = $package.AssemblyReferences | %{ $_.Name }
            foreach ($reference in $project.Object.References)
            {
                if ($asms -contains $reference.Name + ".dll")
                {
                    $reference.CopyLocal = $false
                }
            }
        }
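    As a usage note, the same pack/consume flow can also be driven from the command line instead of Package Explorer. A sketch only - the package ID, share path, and versions below are the ones used in this walkthrough, and the flags are standard nuget.exe (2.x-era) options:

        # Build the .nupkg from the exported .nuspec
        nuget pack MyPackage.nuspec

        # Register the network share as a package source, then install from it
        nuget sources add -Name LocalShare -Source \\IL-CV-004-W7D\Packages
        nuget install MyPackage -Version 1.0.0.3 -OutputDirectory packages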

    Read the article

  • Analytics in an Omni-Channel World

    - by David Dorf
    Retail has been around ever since mankind started bartering. The earliest transactions were very specific to the individuals buying and selling; then someone had the bright idea to open a store. Those transactions were a little more generic, but the store owner still knew his customers and what they wanted. As the chains rolled out, customer intimacy was sacrificed for scale, and retailers began to rely on segments and clusters. But thanks to the widespread availability of data, and the technology to convert said data into information, retailers are getting back to details. The retail industry is following a maturity model for analytics that has progressed through five stages, each delivering more value than the previous.

    Store Analytics. Brick-and-mortar retailers (and pure-play catalogers as well) that collect anonymous basket-level data are able to get some sense of demand to help with allocation decisions. Promotions and foot traffic can be measured to understand marketing effectiveness, and perhaps focus groups can help test ideas. But decisions are influenced by the majority, using faceless customer segments and aggregated industry data points. Loyalty programs help a little, but in many cases the cost outweighs the benefits.

    Web Analytics. The web made it much easier to collect data on specific, yet still anonymous, consumers using cookies to track visits. Clickstreams and product searches are analyzed to understand the purchase journey, gauge demand, and better understand up-selling opportunities. Personalization begins to allow retailers to target consumers with recommendations.

    Cross-Channel Analytics. This phase is a minor one, but it's where most retailers probably sit today. They are able to use information from one channel to bolster activities in another. However, there are technical challenges in combining data silos, so it's not an easy task. But for those retailers that are able to perform analytics on both sources of data, the payoff is pretty nice. Revenue per customer begins to go up as customers have a better brand experience.

    Mobile & Social Analytics. Big data technologies are enabling a 360-degree view of the customer by incorporating psychographic data from social sites alongside traditional demographic data. Retailers can track individual preferences, opinions, hobbies, etc. in order to understand a consumer's motivations. Using mobile devices, consumers can interact with brands anywhere, anytime, accessing deep product information and reviews. Mobile, combined with a loyalty program, presents an opportunity to put shopping into geographic context: understanding paths to the store and patterns within the store, and serving as an always-on advertising conduit.

    Omni-Channel Analytics. All this data, along with the proper technology, represents a new paradigm in which the clock is turned back and retail becomes very personal once again. Rich, individualized data better illuminates demand, allows for highly localized assortments, and helps tailor up-selling. Interactions across all channels help build an accurate profile of each consumer and allow retailers to tailor the retail experience to meet the heightened expectations of today's sophisticated shopper. And of course this culminates in greater customer satisfaction and business profitability.

    Read the article

  • The Calmest IT Guy in the World

    - by Markus Weber
    Unplanned outages and downtime still result not only in major productivity losses but also in major financial losses. Along the same lines, if zero planned downtime and zero data loss are key to your IT environment and your business requirements, planning for them becomes very important, all while balancing performance, high availability, and cost. Oracle Database high availability technologies will help you achieve these mission-critical goals, and they are reflected in Oracle's best-practices offering, the Maximum Availability Architecture (MAA). We created three neat, short videos showcasing some typical use cases and highlighting three important components (among many more) of MAA:
    - Oracle Real Application Clusters (RAC)
    - Oracle Active Data Guard
    - Oracle Flashback Technologies
    Make sure to watch those videos here, and learn about the challenges and solutions around high-availability database environments from a recent Independent Oracle Users Group (IOUG) survey.

    Read the article
