Search Results

Search found 6431 results on 258 pages for 'pthread join'.


  • Oracle Cloud and Oracle Platinum Services Announcements

    - by kellsey.ruppel
    Live Webcast - Oracle Cloud and Oracle Platinum Services Announcements. Wednesday, June 06, 2012, 1:00 p.m. PT – 2:30 p.m. PT. Please join Larry Ellison and Mark Hurd for important Oracle announcements. Be among the first to learn about new developments in Oracle’s cloud strategy and game-changing advances in Oracle Support. Register now to watch the live webcast at your desk, or, if you are based in the San Francisco Bay Area, register to attend the live event in Redwood Shores. Oracle values your privacy, and will treat the information we collect from you as a result of your registration and participation in this activity in accordance with the Oracle Privacy Policy. Stay connected: join the conversation at #oraclecloud #oraclesupport

    Read the article

  • Best Party of 2011: Introducing Java 7

    - by Tori Wieldt
    As a member of the Java community, you played a critical role in building Java 7. You contributed great ideas for new features and new ways of working and collaborating to take the next step in development. And now, it’s time to celebrate with a global gathering of the Java community—online and live. See your ideas at work. Hear about everything Java 7 can do for you and how we’re moving Java forward together. Join us for celebrations in Redwood Shores, São Paulo, or London as we unveil the latest innovations in Java 7. The three events will be linked to each other by satellite, and a webcast will be available if you can't attend the live events. Learn from fellow developers around the globe who are getting the most out of the new features. Get overviews from the Java experts on Project Coin, the Fork/Join framework, the new file system API, and improvements to the VM, plus a panel discussion with Q&A. Thursday, July 07, 2011 – Redwood Shores, United States: 9:00 a.m. PT - 1:30 p.m. PT; São Paulo, Brazil: 1:00 p.m. BRT; London, England: 5:00 p.m. BST; Live Webcast: 9:00 a.m. PT - 1:30 p.m. PT. Get more information about the July 7 events. You need to register for the live events or the webcast. There will also be other celebrations at Java User Group (JUG) meetings over the next few months. Find your local JUG. Follow the conversation on Twitter: follow @Java and use #java7. Java is moving forward, let's party!

    Read the article

  • Two Upcoming Server Virtualization Webcasts

    - by Chris Kawalek
    We have a couple of interesting server virtualization webcasts coming up that you might be interested in. Have a look:

    Webcast 1: October 23rd, 9 am PST – "Virtualized Infrastructure Simplified with Oracle VM and NetApp Storage and Data Management Solutions: Point-and-Click Interface Deploys Virtualized Data Infrastructure in Minutes." Provisioning and deploying a virtual data infrastructure can be costly, time-consuming, and prone to error. Oracle VM and NetApp joint solutions, however, give you a single point-and-click interface to deploy your virtualized data infrastructure seamlessly in minutes. Join us in this live webcast to learn more from product experts and view a product demo. Register (for free!) here.

    Webcast 2: November 7th, 9 am PST – "Report Shows Oracle VM Up to 10x Faster than VMware vSphere 5 in Time to Deployment." Time is your IT department’s greatest commodity. So when a new report reveals that your IT staff can deploy Oracle Real Application Clusters (Oracle RAC) up to 10 times faster than a traditional install performed with VMware vSphere 5, it’s newsworthy. Join us in this live webcast to learn how you can realize your time savings. Register (for free!) here.

    Read the article

  • Find Thousands of Oracle Jobs on oDesk

    - by Brandye Barrington
    We are happy to announce we have teamed up with oDesk, the world’s largest and fastest-growing online workplace, to bring thousands of job opportunities to the Oracle Certified community.  On oDesk, skilled independent professionals can tap into global demand for their skills by accessing hundreds of thousands of job opportunities around the world—more than 444,000 jobs were posted on oDesk in Q2 2012 alone.  And with the freedom to work whenever and wherever they like, on the projects they choose and at the rate they set, oDesk contractors are building their online reputations and taking control of their careers—oDesk data shows that contractors increase their rates by an average of 190% over three years. And with oDesk’s new Oracle Certified Group, contractors can set themselves apart by showcasing an Oracle Certified badge on their profile, giving them a competitive advantage when they apply to the thousands of open Oracle jobs on oDesk.  oDesk is free to join—as is the Oracle Certified Group—and guarantees payment for hourly work. With more than 480,000 businesses from around the world registered on the platform, professionals have a wide range of jobs to choose from, including those that require MySQL, Java, and many other types of Oracle skills. Learn more about Oracle job opportunities and join the Certified Group on oDesk here.

    Read the article

  • share distribution question

    - by facebook-100000781341887
    Hi, I just developed a Facebook game (Mafia-like), but the graphics I made are not good: they are traced in AI from existing photos and then colored. So I invited a friend to join me. He is a graphic designer who owns a company with a friend of his (I know both of them). For the equity split I expected at least 70% for me and at most 30% for them (both of them want to join). They came back with a counter offer of 60% for me and 40% for them. I feel their counter offer is unacceptable because they would only produce the artwork part time, while all the other work, such as coding, web hosting, etc., is what I do full time. Their argument for 40% is that they will produce good graphics and can provide an advertising channel (a local magazine), and so on. Personally, I don't think the game needs advertising in a local magazine because the game is not targeted at a local market. Please give me some comments on this issue (is the split fair? how important is the game's artwork, and is it worth more than 30%?), or share your experience with this kind of arrangement. Thanks in advance.

    Read the article

  • MOS Community rewards Ram Kasthuri w/ FREE OOW Pass!

    - by cwarticki
    Congratulations to Ram Kasthuri on receiving a free full conference pass to Oracle OpenWorld, and thank you for helping other members through your participation in My Oracle Support Community! My Oracle Support Community member Ram Kasthuri received a free Oracle OpenWorld pass from the My Oracle Support Community in appreciation for his work in answering questions posted by other Community members. Ram, an independent consultant, is an Application Solution Architect with Canon. He has been a valued Oracle customer for over 13 years. Ram is an active member in several of the Oracle EBS communities and has achieved the Expert level of recognition through his active participation. Ram described the value he receives from My Oracle Support Community: “What I like best about the communities is the vicarious learning from real business scenarios posted by other Community members. The questions are real opportunities to learn all things Oracle, and EBS especially.” Ram is one of those members who answers more questions than he posts, so he must get a lot of that vicarious learning. Oracle Premier Support customers can get answers and learn from both peers who have faced similar situations and Oracle experts. Join us in My Oracle Support Community. Look for Ram this week at Oracle OpenWorld, and join him in My Oracle Support Community when you return to work. And while you’re at Oracle OpenWorld, Oracle Customer Support Services invites you to expand your knowledge by meeting with Oracle Support experts. Learn more about our sessions and networking opportunities today!

    Read the article

  • Oracle Endeca "Getting Started" Partner Guide

    - by Grant Schofield
    For partners looking for a concise, step-by-step guide to getting started with Oracle Endeca Information Discovery, here it is, to help you get started as quickly as possible.

    Step 1: Join the Knowledge Zone as a company and as an individual. This gives you a) the right to resell Oracle Endeca ID, and b) notice of any free or subsidised training events in your region.
    Step 2: For a quick general overview and positioning, see the following article, in particular the Agile BI video series, which is useful to share with prospective clients. You will also find a link to the official OEID Data Sheet.
    Step 3: For a more detailed overview, there is a live recorded OEID partner webcast with downloadable slides. In conjunction with this, your sales/presales team has free access to the official OEID Partner Playbook as well as the full Oracle price book.
    Step 4: Download and install the OEID software. Please be aware that you will need a 64-bit machine and a 64-bit operating system. A useful workaround for partners on a 32-bit operating system is to use Oracle's free VirtualBox software to quickly create a Linux image and install on that.
    Step 5: Attend a free or subsidised training event in your region. Please join the Knowledge Zone as an individual (opt in) to be informed of these; we will also publish them via the blog.

    Things are moving fast, so please be aware that the team is working hard to produce more and more material, such as downloadable data sets (structured and unstructured), a downloadable image, and access to demos. Over the next few weeks we will update this article as soon as new material becomes available!

    Read the article

  • How to remove the last character from StringBuilder

    - by hmloo
    We usually use StringBuilder to append strings in a loop and build a string of items separated by a delimiter, but you always end up with an extra delimiter at the end. This code sample shows how to remove the last delimiter from a StringBuilder.

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var list = Enumerable.Range(0, 10).ToArray();
                StringBuilder sb = new StringBuilder();
                foreach (var item in list)
                {
                    sb.Append(item).Append(",");
                }
                sb.Length--; // Just reduce the length of the StringBuilder, it's that easy
                Console.WriteLine(sb);
            }
        }
        // Output: 0,1,2,3,4,5,6,7,8,9

    Alternatively, we can use string.Join for the same result; please refer to the code sample below.

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var list = Enumerable.Range(0, 10).Select(n => n.ToString()).ToArray();
                string str = string.Join(",", list);
                Console.WriteLine(str);
            }
        }

    Read the article

  • Reformatting and version control

    - by l0b0
    Code formatting matters. Even indentation matters. And consistency is more important than minor improvements. But projects usually don't have a clear, complete, verifiable and enforced style guide from day 1, and major improvements may arrive any day. Maybe you find that SELECT id, name, address FROM persons JOIN addresses ON persons.id = addresses.person_id; could be better written as / is better written than SELECT persons.id, persons.name, addresses.address FROM persons JOIN addresses ON persons.id = addresses.person_id; while working on adding more columns to the query. Maybe this is the most complex of all four queries in your code, or a trivial query among thousands. No matter how difficult the transition, you decide it's worth it. But how do you track code changes across major formatting changes? You could just give up and say "this is the point where we start again", or you could reformat all queries in the entire repository history. If you're using a distributed version control system like Git you can revert to the first commit ever, and reformat your way from there to the current state. But it's a lot of work, and everyone else would have to pause work (or be prepared for the mother of all merges) while it's going on. Is there a better way to change history which gives the best of all results: Same style in all commits Minimal merge work ? To clarify, this is not about best practices when starting the project, but rather what should be done when a large refactoring has been deemed a Good Thing™ but you still want a traceable history? Never rewriting history is great if it's the only way to ensure that your versions always work the same, but what about the developer benefits of a clean rewrite? Especially if you have ways (tests, syntax definitions or an identical binary after compilation) to ensure that the rewritten version works exactly the same way as the original?
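
    (One way to run the reformat-the-whole-history idea mentioned above, sketched below, is to let git replay every commit through a formatter. This is only a rough sketch, not something from the question: the reformat.rb script is hypothetical and would contain your actual formatting rules, everyone must re-clone or hard-reset after the rewrite, and it is only worth doing if tests or identical build output confirm behaviour is unchanged.)

        # Rough sketch (Ruby driving git): rewrite every commit on every branch
        # through a hypothetical formatter so the whole history shares one style.
        # 'git filter-branch --tree-filter CMD' checks out each commit, runs CMD
        # against the working tree, and records the result as the rewritten commit.
        formatter = "ruby /path/to/reformat.rb"   # hypothetical script that reformats files in place
        ok = system("git", "filter-branch", "--tree-filter", formatter, "--", "--all")
        abort "history rewrite failed" unless ok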

    Read the article

  • JSR Updates and EC Meeting Tuesday @ 15:00 PST

    - by Heather VanCura
    JSR 310, Date and Time API, has moved to JCP 2.9 (first JCP 2.9 JSR!) JSR 236, Concurrency Utilities for Java EE, has published an Early Draft Review. This review ends 15 December 2012.  Tomorrow, Tuesday 20 November is the last Public EC Meeting of 2012, and the first EC meeting with the merged EC. The second hour of this meeting will be open to the public at 3:00 PM PST. The agenda includes  JSR 355,  EC merge implementation report, JSR 358 (JCP.next.3) status report, JCP 2.8 status update and community audit program.  Details are below. We hope you will join us, but if you cannot attend, not to worry--the recording and materials will also be public on the JCP.org multimedia page. Meeting details Date & Time Tuesday November 20, 2012, 3:00 - 4:00 pm PST Location Teleconference Dial-in +1 (866) 682-4770 (US) Conference code: 627-9803 Security code: 52732 ("JCPEC" on your phone handset) For global access numbers see http://www.intercall.com/oracle/access_numbers.htm Or +1 (408) 774-4073 WebEx Browse for the meeting from https://jcp.webex.com No registration required (enter your name and email address) Password: JCPEC Agenda JSR 355 (the EC merge) implementation report JSR 358 (JCP.next.3) status report 2.8 status update and community audit program Discussion/Q&A Note The call will be recorded and the recording published on jcp.org, so those who are unable to join in real-time will still be able to participate.

    Read the article

  • How ethical is it to accept a second offer while serving the notice period for the first company? [closed]

    - by iammilind
    I got an offer from company X two months back. Though the compensation was pretty good, I asked for more. They declined, and I continued my search. I was expecting good companies (Y, Z) to contact me, but that didn't happen. Two months later I re-approached company X and they issued me the same offer. I accepted it, resigned from my current company, and am now serving the notice period (two months). Now the twist: I got calls from Y and Z (good companies), which definitely offer better compensation. If they select me, my plan is to accept the better offer from Y/Z and inform X immediately that I have a better opportunity and will not be able to join them... Is this ethical? [Edit note: I only have to send an acceptance email (i.e. no contract signing), which doesn't legally bind me to join that company. So legally I am not doing anything wrong; however, I am worried about whether I am doing the right thing ethically.]

    Read the article

  • Oracle Transportation Management (Lead) Functional Consultant in Germany

    - by user769227
    My name is Giovanni and I lead the practice of OTM (Oracle Transportation Management) consultants in Western Europe. I currently have a role open for an OTM Lead Consultant to join my international team in Germany. Oracle Transportation Management is the leading TMS application software on the market, as confirmed by Gartner’s classification of it as a Leader in its TMS Magic Quadrant, with the highest rating among vendors. The OTM Consulting practice is a team of OTM functional and technical specialists located across Europe whose broad objective is to assist companies in the implementation of their TMS solution based on OTM. These companies are leading shippers from various industries as well as logistics service providers. Key requirements for this role are relevant experience with supply chain or transportation management in other consulting organizations or large enterprises, the drive to learn the leading TMS application software in today’s market, and the interest in joining a truly international team. We offer the opportunity to work for a leader of the IT industry and to help international clients realize their business transformation initiatives through innovation. If you have an entrepreneurial spirit and are looking for a work culture where innovation is the goal, hard work is expected, and creativity is rewarded, then please visit this link for more information.

    Read the article

  • The Oracle Architects Training: 40 training sessions for our EMEA partners to build their Oracle Applications and Technical skills

    - by Richard Lefebvre
    There is a lot more to Oracle technology than meets the eye. Sure, you already belong to a small circle of our most experienced and committed partners. But are you making the best use possible of our technology solutions? Put it to the test: join the “Oracle Partner Architects Training”. It is aimed at providing your experts, architects and consultants with in-depth architectural knowledge about Oracle technology. Here is your chance to learn from the best. Seasoned speakers, exclusive content and no product marketing. Oracle technology beyond the obvious. Mark your calendar: the Oracle Partner Architects Training is an online training program. Sign up for the live WebEx sessions (scheduled from January 2013 to April 2013) or watch replays as they become available, and feel free to follow the training sessions at your own pace. Also, last year’s sessions are still accurate and still available on architects.oraevents.eu. NOTE: Looking to get your consultants Oracle certified? One more reason to join the Oracle Partner Architects Training: it is the fast track to getting their expertise validated with an Oracle certificate.

    Read the article

  • Leaving my ongoing internship for a better company

    - by AnonymousAsExpected
    Hi. I am currently employed as an intern with a software company X. I had no particular interest in working on their line of work, but since I had no other alternative, I had to join it. I had also applied to a good company Y, which has a better name than X and the work is exactly what I wanted. More than anything, the mention of Y on my resume will make a big difference later. I really had no idea that Y would accept me, since they were not replying to my repeated requests to inform me of the status of my application. Now fifteen days into my internship with X, Y sent me an offer letter to join. What should I do now? There are other interns working with X on the same project I was, and so of course the work won't suffer. My problem is, I want to quit X, but how do I do it in the most polite way possible? I am afraid to ask my boss, he is rude already and I fear he might take it badly. And how do I handle this in my resume? If two years down the line someone asks me why I left X for Y, won't that look bad? I am really confused, working with Y would be like a dream come true. But I fear the negative impact of leaving X. Hoping that someone can perhaps share a similar experience.

    Read the article

  • eSTEP TechCast - November 2012

    - by uwes
    Dear partner, we are pleased to announce our next eSTEP TechCast on Thursday, 1st of November, and would be happy if you could join. Please see below the details for the next TechCast. Date and time: Thursday, 01 November 2012, 11:00 - 12:00 GMT (12:00 - 13:00 CET; 15:00 - 16:00 GST). Title: Update on Solaris 11. Abstract: Oracle announced an update to Solaris 11 at Oracle OpenWorld. Join us in this call to get a brief summary of the enhancements and how they help you build better solutions for your customers. Target audience: Tech Presales. Speaker: Jörg Möllenkamp. Call info: Call-in toll-free number: 08006948154 (United Kingdom); Call-in toll-free number: +44-2081181001 (United Kingdom); Conference code: 803 594 3; Security passcode: 9876. WebEx info (Oracle Web Conference): Meeting number: 599 775 167; Meeting password: tech2011. Playback / Recording / Archive: The webcast will be recorded and will be available shortly after the event in the eSTEP portal under the Events tab, where you can also find material from previously delivered eSTEP TechCasts. Use your email address and the PIN eSTEP_2011 to get access. Feel free to have a look. We are happy to get your comments and feedback. Thanks and best regards, Partner HW Enablement EMEA

    Read the article

  • Create Math Game with PHP, Ajax, Jquery

    - by Sambucasun
    I am developing a website where users can create their own games, which other users can then join. It's a simple maths game that fires equations at the players based on a specified time limit or question count. The moment a user creates a game, I want it listed in a "Current Games" section; other users can browse the list and pick a game to join. After a game is created, the creator should see a screen showing their name and display picture, and as others join, that list should update automatically. Once enough users have joined, I will start the game, and the same list should be shown to everyone who joined. When the game is over, everyone should see a summary list. I have gone through a couple of threads but could not get a clear idea: do I need Comet or a similar technology to build such a game, or will plain PHP, Ajax and jQuery suffice? I also want the site to be mobile compatible, so I am building it in HTML5. If I build this game with just Ajax, will there be performance issues when playing on mobile? I am not very experienced, so I just need guidance on what is appropriate for my requirements.

    Read the article

  • How to match responses from a server with their corresponding requests? [closed]

    - by Deele
    There is a server that responds to requests on a socket. The client has functions to emit requests and functions to handle responses from the server. The problem is that the request sending function and the response handling function are two unrelated functions. Given a server response X, how can I know whether it's a response to request X or some other request Y? I would like to make a construct that would ensure that response X is definitely the answer to request X and also to make a function requestX() that returns response X and not some other response Y. This question is mostly about the general programming approach and not about any specific language construct. Preferably, though, the answer would involve Ruby, TCP sockets, and PHP. My code so far:

        require 'socket'

        class TheConnection
          def initialize(config)
            @config = config
          end

          def send(s)
            toConsole("--> #{s}")
            @conn.send "#{s}\n", 0
          end

          def connect()
            # Connect to the server
            begin
              @conn = TCPSocket.open(@config['server'], @config['port'])
            rescue Interrupt
            rescue Exception => detail
              toConsole('Exception: ' + detail.message())
              print detail.backtrace.join('\n')
              retry
            end
          end

          def getSpecificAnswer(input)
            send "GET #{input}"
          end

          def handle_server_input(s)
            case s.strip
            when /^Hello. (.*)$/i
              toConsole "[ Server says hello ]"
              send "Hello to you too! #{$1}"
            else
              toConsole(s)
            end
          end

          def main_loop()
            while true
              ready = select([@conn, $stdin], nil, nil, nil)
              next if !ready
              for s in ready[0]
                if s == $stdin then
                  return if $stdin.eof
                  s = $stdin.gets
                  send s
                elsif s == @conn then
                  return if @conn.eof
                  s = @conn.gets
                  handle_server_input(s)
                end
              end
            end
          end

          def toConsole(msg)
            t = Time.new
            puts t.strftime("[%H:%M:%S]") + ' ' + msg
          end
        end

        @config = Hash[ 'server' => 'test.server.com', 'port' => '2020' ]

        $conn = TheConnection.new(@config)
        $conn.connect()
        $conn.getSpecificAnswer('itemsX')

        begin
          $conn.main_loop()
        rescue Interrupt
        rescue Exception => detail
          $conn.toConsole('Exception: ' + detail.message())
          print detail.backtrace.join('\n')
          retry
        end
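
    (A common approach, sketched below in Ruby, is to tag every outgoing request with a client-generated id and have the server echo that id back in its response, so each incoming line can be matched to the request that produced it. This is only an illustrative sketch, not the poster's protocol: the "<id> <payload>" line framing is an assumption and requires a cooperating server.)

        require 'socket'

        # Minimal sketch: correlate responses with requests via an id the server echoes back.
        class TaggedConnection
          def initialize(host, port)
            @conn    = TCPSocket.open(host, port)
            @pending = {}   # id => request text still waiting for its answer
            @next_id = 0
          end

          # Send a request tagged with a fresh id, e.g. "7 GET itemsX", and remember it.
          def request(command)
            id = (@next_id += 1)
            @pending[id] = command
            @conn.puts "#{id} #{command}"
            id
          end

          # Match an incoming line such as "7 <payload>" back to the request with id 7.
          def handle_server_input(line)
            id, payload = line.strip.split(' ', 2)
            original = @pending.delete(id.to_i)
            if original
              puts "response to '#{original}': #{payload}"
            else
              puts "unsolicited / unmatched message: #{line}"
            end
          end
        end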

    Read the article

  • You cannot do cross joins in SQL Azure but there is a way around that....

    - by SeanBarlow
    So I was asked today how to do cross-database joins in SQL Azure using LINQ. Well, the simple answer is that you can't: a single query cannot join across two SQL Azure databases. It is not supported, but there is a way around it, and the solution is actually very simple and easy to implement. So here is what I did and how I did it. I created two SQL Azure databases. The first database is called AccountDb and has a single table named Account, which has ID, CompanyId and Name columns. The second database I called CompanyDb, and it contains two tables. The first table I named Company and the second I named Address. The Company table has Id and Name columns. The Address table has Id and CompanyId columns. Since we cannot join across the two databases in Azure, we have to have one of the models preloaded with data. I simply put the Accounts into a list and use that in my join.

        var accounts = new AccountsModelContainer().Accounts.ToList();
        var companies = new CompanyModelContainer().Companies;
        var query = from account in accounts
                    join company in
                        (from c in companies select c)
                        on account.CompanyId equals company.Id
                    select new AccountView()
                    {
                        AccountName = account.Name,
                        CompanyName = company.Name,
                        Addresses = company.Addresses
                    };
        return query.ToList();

    So as long as you have your data loaded from one of the contexts, you can still execute your queries and get back the data that you want.

    Read the article

  • Reportviewer stored procedure [closed]

    - by Liesl
    I want to write a stored procedure for my invoice reportviewer. After invoice is generated in reportviewer it must also add the data to my Invoice table. This is all my tables in my database: CREATE TABLE [dbo].[Waybills]( [WaybillID] [int] IDENTITY(1,1) NOT NULL, [SenderName] [varchar](50) NULL, [SenderAddress] [varchar](50) NULL, [SenderContact] [int] NULL, [ReceiverName] [varchar](50) NULL, [ReceiverAddress] [varchar](50) NULL, [ReceiverContact] [int] NULL, [UnitDescription] [varchar](50) NULL, [UnitWeight] [int] NULL, [DateReceived] [date] NULL, [Payee] [varchar](50) NULL, [CustomerID] [int] NULL, PRIMARY KEY CLUSTERED CREATE TABLE [dbo].[Customer]( [CustomerID] [int] IDENTITY(1,1) NOT NULL, [customerName] [varchar](30) NULL, [CustomerAddress] [varchar](30) NULL, [CustomerContact] [varchar](30) NULL, [VatNo] [int] NULL, CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ) CREATE TABLE [dbo].[Cycle]( [CycleID] [int] IDENTITY(1,1) NOT NULL, [CycleNumber] [int] NULL, [StartDate] [date] NULL, [EndDate] [date] NULL ) ON [PRIMARY] CREATE TABLE [dbo].[Payments]( [PaymentID] [int] IDENTITY(1,1) NOT NULL, [Amount] [money] NULL, [PaymentDate] [date] NULL, [CustomerID] [int] NULL, PRIMARY KEY CLUSTERED Create table Invoices ( InvoiceID int IDENTITY(1,1), InvoiceNumber int, InvoiceDate date, BalanceBroughtForward money, OutstandingAmount money, CustomerID int, WaybillID int, PaymentID int, CycleID int PRIMARY KEY (InvoiceID), FOREIGN KEY (CustomerID) REFERENCES Customer(CustomerID), FOREIGN KEY (WaybillID) REFERENCES Waybills(WaybillID), FOREIGN KEY (PaymentID) REFERENCES Payments(PaymentID), FOREIGN KEY (CycleID) REFERENCES Cycle(CycleID) ) I want my sp to find all waybills for specific customer in a specific cycle with payments made from this client. All this data must then be added into the INVOICE table. Can someone please help me or show me on the right direction? create proc GenerateInvoice @StartDate date, @EndDate date, @Payee varchar(30) AS SELECT Waybills.WaybillNumber Waybills.SenderName, Waybills.SenderAddress, Waybills.SenderContact, Waybills.ReceiverName, Waybills.ReceiverAddress, Waybills.ReceiverContact, Waybills.UnitDescription, Waybills.UnitWeight, Waybills.DateReceived, Waybills.Payee, Payments.Amount, Payments.PaymentDate, Cycle.CycleNumber, Cycle.StartDate, Cycle.EndDate FROM Waybills CROSS JOIN Payments CROSS JOIN Cycle WHERE Waybills.ReceiverName = @Payee AND (Waybills.DateReceived BETWEEN (@StartDate) AND (@EndDate)) Insert Into Invoices (InvoiceNumber, InvoiceDate, BalanceBroughtForward, OutstandingAmount) Values (@InvoiceNumber, @InvoiceDate, @BalanceBroughtForward, @ OutstandingAmount) go

    Read the article

  • [LINQ] Master–Detail Same Record (I)

    - by JTorrecilla
    PROBLEM: I am working on a project based on LINQ, EF, and C# with VS2010. The following tables show what I have and what I want to show.

        Header
        C1  C2  C3
        1   P1  01/01/2011
        2   P2  01/02/2011

        Details
        1   1   D1
        2   1   D2
        3   1   D3
        4   2   D1
        5   2   D4

        Expected Results
        1   P1  01/01/2011  D1, D2, D3
        2   P2  01/02/2011  D1, D4

    IDEAS: At the beginning I saw three possible approaches: doing it inside the DB (it could be achieved with a CURSOR in a stored procedure), doing it from .NET with loops, or doing it with LINQ (I love it!!).

    FIRST APPROXIMATION: An example with a simple class holding a list. With an Employee class that acts as the Header table:

        public class Employee
        {
            public Employee() { }
            public Int32 ID { get; set; }
            public String FirstName { get; set; }
            public String LastName { get; set; }
            public List<string> Numbers { get; set; } // Acts as the Details table
        }

    We can show all numbers contained by each Employee:

        List<Employee> listado = new List<Employee>();
        // Fill listado
        var query = from Employee em in listado
                    let Nums = string.Join(";", em.Numbers)
                    select new
                    {
                        em.FirstName,
                        Nums
                    };

    The let clause allows us to hold the result of string.Join over the Numbers list of the Employee class. A little example. I will post the second part, achieving the same with Entity Framework, as soon as possible. Best regards.

    Read the article

  • How to generate DELETE statements in PL/SQL, based on the tables FK relations?

    - by The chicken in the kitchen
    Is it possible via script/tool to generate authomatically many delete statements based on the tables fk relations, using Oracle PL/SQL? In example: I have the table: CHICKEN (CHICKEN_CODE NUMBER) and there are 30 tables with fk references to its CHICKEN_CODE that I need to delete; there are also other 150 tables foreign-key-linked to that 30 tables that I need to delete first. Is there some tool/script PL/SQL that I can run in order to generate all the necessary delete statements based on the FK relations for me? (by the way, I know about cascade delete on the relations, but please pay attention: I CAN'T USE IT IN MY PRODUCTION DATABASE, because it's dangerous!) I'm using Oracle DataBase 10G R2. This is the result I've written, but it is not recursive: This is a view I have previously written, but of course it is not recursive! CREATE OR REPLACE FORCE VIEW RUN ( OWNER_1, CONSTRAINT_NAME_1, TABLE_NAME_1, TABLE_NAME, VINCOLO ) AS SELECT OWNER_1, CONSTRAINT_NAME_1, TABLE_NAME_1, TABLE_NAME, '(' || LTRIM ( EXTRACT (XMLAGG (XMLELEMENT ("x", ',' || COLUMN_NAME)), '/x/text()'), ',') || ')' VINCOLO FROM ( SELECT CON1.OWNER OWNER_1, CON1.TABLE_NAME TABLE_NAME_1, CON1.CONSTRAINT_NAME CONSTRAINT_NAME_1, CON1.DELETE_RULE, CON1.STATUS, CON.TABLE_NAME, CON.CONSTRAINT_NAME, COL.POSITION, COL.COLUMN_NAME FROM DBA_CONSTRAINTS CON, DBA_CONS_COLUMNS COL, DBA_CONSTRAINTS CON1 WHERE CON.OWNER = 'TABLE_OWNER' AND CON.TABLE_NAME = 'TABLE_OWNED' AND ( (CON.CONSTRAINT_TYPE = 'P') OR (CON.CONSTRAINT_TYPE = 'U')) AND COL.TABLE_NAME = CON1.TABLE_NAME AND COL.CONSTRAINT_NAME = CON1.CONSTRAINT_NAME --AND CON1.OWNER = CON.OWNER AND CON1.R_CONSTRAINT_NAME = CON.CONSTRAINT_NAME AND CON1.CONSTRAINT_TYPE = 'R' GROUP BY CON1.OWNER, CON1.TABLE_NAME, CON1.CONSTRAINT_NAME, CON1.DELETE_RULE, CON1.STATUS, CON.TABLE_NAME, CON.CONSTRAINT_NAME, COL.POSITION, COL.COLUMN_NAME) GROUP BY OWNER_1, CONSTRAINT_NAME_1, TABLE_NAME_1, TABLE_NAME; ... and it contains the error of using DBA_CONSTRAINTS instead of ALL_CONSTRAINTS... Please pay attention to this: http://stackoverflow.com/questions/485581/generate-delete-statement-from-foreign-key-relationships-in-sql-2008/2677145#2677145 Another user has just written it in SQL SERVER 2008, anyone is able to convert to Oracle 10G PL/SQL? I am not able to... :-( This is the code written by another user in SQL SERVER 2008: DECLARE @COLUMN_NAME AS sysname DECLARE @TABLE_NAME AS sysname DECLARE @IDValue AS int SET @COLUMN_NAME = '<Your COLUMN_NAME here>' SET @TABLE_NAME = '<Your TABLE_NAME here>' SET @IDValue = 123456789 DECLARE @sql AS varchar(max) ; WITH RELATED_COLUMNS AS ( SELECT QUOTENAME(c.TABLE_SCHEMA) + '.' 
+ QUOTENAME(c.TABLE_NAME) AS [OBJECT_NAME] ,c.COLUMN_NAME FROM PBANKDW.INFORMATION_SCHEMA.COLUMNS AS c WITH (NOLOCK) INNER JOIN PBANKDW.INFORMATION_SCHEMA.TABLES AS t WITH (NOLOCK) ON c.TABLE_CATALOG = t.TABLE_CATALOG AND c.TABLE_SCHEMA = t.TABLE_SCHEMA AND c.TABLE_NAME = t.TABLE_NAME AND t.TABLE_TYPE = 'BASE TABLE' INNER JOIN ( SELECT rc.CONSTRAINT_CATALOG ,rc.CONSTRAINT_SCHEMA ,lkc.TABLE_NAME ,lkc.COLUMN_NAME FROM PBANKDW.INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS rc WITH (NOLOCK) INNER JOIN PBANKDW.INFORMATION_SCHEMA.KEY_COLUMN_USAGE lkc WITH (NOLOCK) ON lkc.CONSTRAINT_CATALOG = rc.CONSTRAINT_CATALOG AND lkc.CONSTRAINT_SCHEMA = rc.CONSTRAINT_SCHEMA AND lkc.CONSTRAINT_NAME = rc.CONSTRAINT_NAME INNER JOIN PBANKDW.INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc WITH (NOLOCK) ON rc.CONSTRAINT_CATALOG = tc.CONSTRAINT_CATALOG AND rc.CONSTRAINT_SCHEMA = tc.CONSTRAINT_SCHEMA AND rc.UNIQUE_CONSTRAINT_NAME = tc.CONSTRAINT_NAME INNER JOIN PBANKDW.INFORMATION_SCHEMA.KEY_COLUMN_USAGE rkc WITH (NOLOCK) ON rkc.CONSTRAINT_CATALOG = tc.CONSTRAINT_CATALOG AND rkc.CONSTRAINT_SCHEMA = tc.CONSTRAINT_SCHEMA AND rkc.CONSTRAINT_NAME = tc.CONSTRAINT_NAME WHERE rkc.COLUMN_NAME = @COLUMN_NAME AND rkc.TABLE_NAME = @TABLE_NAME ) AS j ON j.CONSTRAINT_CATALOG = c.TABLE_CATALOG AND j.CONSTRAINT_SCHEMA = c.TABLE_SCHEMA AND j.TABLE_NAME = c.TABLE_NAME AND j.COLUMN_NAME = c.COLUMN_NAME ) SELECT @sql = COALESCE(@sql, '') + 'DELETE FROM ' + [OBJECT_NAME] + ' WHERE ' + [COLUMN_NAME] + ' = ' + CONVERT(varchar, @IDValue) + CHAR(13) + CHAR(10) FROM RELATED_COLUMNS PRINT @sql Thank to Charles, this is the latest not working release of the software, I have added a parameter with the OWNER because the referential integrities propagate through about 5 other Oracle users (!!!): CREATE OR REPLACE PROCEDURE delete_cascade ( parent_table VARCHAR2, parent_table_owner VARCHAR2) IS cons_name VARCHAR2 (30); tab_name VARCHAR2 (30); tab_name_owner VARCHAR2 (30); parent_cons VARCHAR2 (30); parent_col VARCHAR2 (30); delete1 VARCHAR (500); delete2 VARCHAR (500); delete_command VARCHAR (4000); CURSOR cons_cursor IS SELECT constraint_name, r_constraint_name, table_name, constraint_type FROM all_constraints WHERE constraint_type = 'R' AND r_constraint_name IN (SELECT constraint_name FROM all_constraints WHERE constraint_type IN ('P', 'U') AND table_name = parent_table AND owner = parent_table_owner) AND delete_rule = 'NO ACTION'; CURSOR tabs_cursor IS SELECT DISTINCT table_name FROM all_cons_columns WHERE constraint_name = cons_name; CURSOR child_cols_cursor IS SELECT column_name, position FROM all_cons_columns WHERE constraint_name = cons_name AND table_name = tab_name; BEGIN FOR cons IN cons_cursor LOOP cons_name := cons.constraint_name; parent_cons := cons.r_constraint_name; SELECT DISTINCT table_name, owner INTO tab_name, tab_name_owner FROM all_cons_columns WHERE constraint_name = cons_name; delete_cascade (tab_name, tab_name_owner); delete_command := ''; delete1 := ''; delete2 := ''; FOR col IN child_cols_cursor LOOP SELECT DISTINCT column_name INTO parent_col FROM all_cons_columns WHERE constraint_name = parent_cons AND position = col.position; IF delete1 IS NULL THEN delete1 := col.column_name; ELSE delete1 := delete1 || ', ' || col.column_name; END IF; IF delete2 IS NULL THEN delete2 := parent_col; ELSE delete2 := delete2 || ', ' || parent_col; END IF; END LOOP; delete_command := 'delete from ' || tab_name_owner || '.' || tab_name || ' where (' || delete1 || ') in (select ' || delete2 || ' from ' || parent_table_owner || '.' 
|| parent_table || ');'; INSERT INTO ris VALUES (SEQUENCE_COMANDI.NEXTVAL, delete_command); COMMIT; END LOOP; END; / In the cursor CONS_CURSOR I have added the condition: AND delete_rule = 'NO ACTION'; in order to avoid deletion in case of referential integrities with DELETE_RULE = 'CASCADE' or DELETE_RULE = 'SET NULL'. Now I have tried to turn from stored procedure to stored function, but the delete statements are not correct: CREATE OR REPLACE FUNCTION deletecascade ( parent_table VARCHAR2, parent_table_owner VARCHAR2) RETURN VARCHAR2 IS cons_name VARCHAR2 (30); tab_name VARCHAR2 (30); tab_name_owner VARCHAR2 (30); parent_cons VARCHAR2 (30); parent_col VARCHAR2 (30); delete1 VARCHAR (500); delete2 VARCHAR (500); delete_command VARCHAR (4000); AT_LEAST_ONE_ITERATION NUMBER DEFAULT 0; CURSOR cons_cursor IS SELECT constraint_name, r_constraint_name, table_name, constraint_type FROM all_constraints WHERE constraint_type = 'R' AND r_constraint_name IN (SELECT constraint_name FROM all_constraints WHERE constraint_type IN ('P', 'U') AND table_name = parent_table AND owner = parent_table_owner) AND delete_rule = 'NO ACTION'; CURSOR tabs_cursor IS SELECT DISTINCT table_name FROM all_cons_columns WHERE constraint_name = cons_name; CURSOR child_cols_cursor IS SELECT column_name, position FROM all_cons_columns WHERE constraint_name = cons_name AND table_name = tab_name; BEGIN FOR cons IN cons_cursor LOOP AT_LEAST_ONE_ITERATION := 1; cons_name := cons.constraint_name; parent_cons := cons.r_constraint_name; SELECT DISTINCT table_name, owner INTO tab_name, tab_name_owner FROM all_cons_columns WHERE constraint_name = cons_name; delete1 := ''; delete2 := ''; FOR col IN child_cols_cursor LOOP SELECT DISTINCT column_name INTO parent_col FROM all_cons_columns WHERE constraint_name = parent_cons AND position = col.position; IF delete1 IS NULL THEN delete1 := col.column_name; ELSE delete1 := delete1 || ', ' || col.column_name; END IF; IF delete2 IS NULL THEN delete2 := parent_col; ELSE delete2 := delete2 || ', ' || parent_col; END IF; END LOOP; delete_command := 'delete from ' || tab_name_owner || '.' || tab_name || ' where (' || delete1 || ') in (select ' || delete2 || ' from ' || parent_table_owner || '.' || parent_table || ');' || deletecascade (tab_name, tab_name_owner); INSERT INTO ris VALUES (SEQUENCE_COMANDI.NEXTVAL, delete_command); COMMIT; END LOOP; IF AT_LEAST_ONE_ITERATION = 1 THEN RETURN ' where COD_CHICKEN = V_CHICKEN AND COD_NATION = V_NATION;'; ELSE RETURN NULL; END IF; END; / Please assume that V_CHICKEN and V_NATION are the criteria to select the CHICKEN to delete from the root table: the condition is: "where COD_CHICKEN = V_CHICKEN AND COD_NATION = V_NATION" on the root table.

    Read the article

  • Part 2 – Load Testing In The Cloud

    - by Tarun Arora
    Welcome to Part 2, In Part 1 we discussed the advantages of creating a Test Rig in the cloud, the Azure edge and the Test Rig Topology we want to get to. In Part 2, Let’s start by understanding the components of Azure we’ll be making use of followed by manually putting them together to create the test rig, so… let’s get down dirty start setting up the Test Rig.  What Components of Azure will I be using for building the Test Rig in the Cloud? To run the Test Agents we’ll make use of Windows Azure Compute and to enable communication between Test Controller and Test Agents we’ll make use of Windows Azure Connect.  Azure Connect The Test Controller is on premise and the Test Agents are in the cloud (How will they talk?). To enable communication between the two, we’ll make use of Windows Azure Connect. With Windows Azure Connect, you can use a simple user interface to configure IPsec protected connections between computers or virtual machines (VMs) in your organization’s network, and roles running in Windows Azure. With this you can now join Windows Azure role instances to your domain, so that you can use your existing methods for domain authentication, name resolution, or other domain-wide maintenance actions. For more details refer to an overview of Windows Azure connect. A very useful video explaining everything you wanted to know about Windows Azure connect.  Azure Compute Windows Azure compute provides developers a platform to host and manage applications in Microsoft’s data centres across the globe. A Windows Azure application is built from one or more components called ‘roles.’ Roles come in three different types: Web role, Worker role, and Virtual Machine (VM) role, we’ll be using the Worker role to set up the Test Agents. A very nice blog post discussing the difference between the 3 role types. Developers are free to use the .NET framework or other software that runs on Windows with the Worker role or Web role. Developers can also create applications using languages such as PHP and Java. More on Windows Azure Compute. Each Windows Azure compute instance represents a virtual server... Virtual Machine Size CPU Cores Memory Cost Per Hour Extra Small Shared 768 MB $0.04 Small 1 1.75 GB $0.12 Medium 2 3.50 GB $0.24 Large 4 7.00 GB $0.48 Extra Large 8 14.00 GB $0.96   You might want to review the Windows Azure Pricing FAQ. Let’s Get Started building the Test Rig… Configuration Machine Role Comments VM – 1 Domain Controller for Playpit.com On Premise VM – 2 TFS, Test Controller On Premise VM – 3 Test Agent Cloud   In this blog post I would assume that you have the domain, Team Foundation Server and Test Controller Installed and set up already. If not, please refer to the TFS 2010 Installation Guide and this walkthrough on MSDN to set up your Test Controller. You can also download a preconfigured TFS 2010 VM from Brian Keller's blog, Brian also has some great hands on Labs on TFS 2010 that you may want to explore. I. Lets start building VM – 3: The Test Agent Download the Windows Azure SDK and Tools Open Visual Studio and create a new Windows Azure Project using the Cloud Template                   Choose the Worker Role for reasons explained in the earlier post         The WorkerRole.cs implements the Run() and OnStart() methods, no code changes required. You should be able to compile the project and run it in the compute emulator (The compute emulator should have been installed as part of the Windows Azure Toolkit) on your local machine.                   
We will only be making changes to WindowsAzureProject, open ServiceDefinition.csdef. Ensure that the vmsize is small (remember the cost chart above). Import the “Connect” module. I am importing the Connect module because I need to join the Worker role VM to the Playpit domain. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect"/> </Imports> </WorkerRole> </ServiceDefinition> Go to the ServiceConfiguration.Cloud.cscfg and note that settings with key ‘Microsoft.WindowsAzure.Plugins.Connect.%%%%’ have been added to the configuration file. This is because you decided to import the connect module. See the config below. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration>             Let’s go step by step and understand all the highlighted parameters and where you can find the values for them.       osFamily – By default this is set to 1 (Windows Server 2008 SP2). Change this to 2 if you want the Windows Server 2008 R2 operating system. The Advantage of using osFamily = “2” is that you get Powershell 2.0 rather than Powershell 1.0. In Powershell 2.0 you could simply use “powershell -ExecutionPolicy Unrestricted ./myscript.ps1” and it will work while in Powershell 1.0 you will have to change the registry key by including the following in your command file “reg add HKLM\Software\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell /v ExecutionPolicy /d Unrestricted /f” before you can execute any power shell. The other reason you might want to move to os2 is if you wanted IIS 7.5.       Activation Token – To enable communication between the on premise machine and the Windows Azure Worker role VM both need to have the same token. Log on to Windows Azure Management Portal, click on Connect, click on Get Activation Token, this should give you the activation token, copy the activation token to the clipboard and paste it in the configuration file. 
Note – Later in the blog I’ll be showing you how to install connect on the on premise machine.                       EnableDomainJoin – Set the value to true, ofcourse we want to join the on windows azure worker role VM to the domain.       DomainFQDN, DomainControllerFQDN, DomainAccountName, DomainPassword, DomainOU, Administrators – This information is specific to your domain. I have extracted this information from the ‘service manager’ and ‘Active Directory Users and Computers’. Also, i created a new Domain-OU namely ‘CloudInstances’ so all my cloud instances joined to my domain show up here, this is optional. You can encrypt the DomainPassword – refer to the instructions here. Or hold fire, I’ll be covering that when i come to certificates and encryption in the coming section.       Now once you have filled all this information up, the configuration file should look something like below, <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration> Next we will be enabling the Remote Desktop module in to the ServiceDefinition.csdef, we could make changes manually or allow a beautiful wizard to help us make changes. I prefer the second option. So right click on the Windows Azure project and choose Publish       Now once you get the publish wizard, if you haven’t already you would be asked to import your Windows Azure subscription, this is simply the Msdn subscription activation key xml. Once you have done click Next to go to the Settings page and check ‘Enable Remote Desktop for all roles’.       As soon as you do that you get another pop up asking you the details for the user that you would be logging in with (make sure you enter a reasonable expiry date, you do not want the user account to expire today). Notice the more information tag at the bottom, click that to get access to the certificate section. See screen shot below.       
From the drop down select the option to create a new certificate        In the pop up window enter the friendly name for your certificate. In my case I entered ‘WAC – Test Rig’ and click ok. This will create a new certificate for you. Click on the view button to see the certificate details. Do you see the Thumbprint, this is the value that will go in the config file (very important). Now click on the Copy to File button to copy the certificate, we will need to import the certificate to the windows Azure Management portal later. So, make sure you save it a safe location.                                Click Finish and enter details of the user you would like to create with permissions for remote desktop access, once you have entered the details on the ‘Remote desktop configuration’ screen click on Ok. From the Publish Windows Azure Wizard screen press Cancel. Cancel because we don’t want to publish the role just yet and Yes because we want to save all the changes in the config file.       Now if you go to the ServiceDefinition.csdef file you will see that the RemoteAccess and RemoteForwarder roles have been imported for you. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect" /> <Import moduleName="RemoteAccess" /> <Import moduleName="RemoteForwarder" /> </Imports> </WorkerRole> </ServiceDefinition> Now go to the ServiceConfiguration.Cloud.cscfg file and you see a whole bunch for setting “Microsoft.WindowsAzure.Plugins.RemoteAccess.%%%” values added for you. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" 
value="MIIBnQYJKoZIhvcNAQcDoIIBjjCCAYoCAQAxggFOMIIBSgIBADAyMB4xHDAaBgNVBAMME1dpbmRvd 3MgQXp1cmUgVG9vbHMCEGa+B46voeO5T305N7TSG9QwDQYJKoZIhvcNAQEBBQAEggEABg4ol5Xol66Ip6QKLbAPWdmD4ae ADZ7aKj6fg4D+ATr0DXBllZHG5Umwf+84Sj2nsPeCyrg3ZDQuxrfhSbdnJwuChKV6ukXdGjX0hlowJu/4dfH4jTJC7sBWS AKaEFU7CxvqYEAL1Hf9VPL5fW6HZVmq1z+qmm4ecGKSTOJ20Fptb463wcXgR8CWGa+1w9xqJ7UmmfGeGeCHQ4QGW0IDSBU6ccg vzF2ug8/FY60K1vrWaCYOhKkxD3YBs8U9X/kOB0yQm2Git0d5tFlIPCBT2AC57bgsAYncXfHvPesI0qs7VZyghk8LVa9g5IqaM Cp6cQ7rmY/dLsKBMkDcdBHuCTAzBgkqhkiG9w0BBwEwFAYIKoZIhvcNAwcECDRVifSXbA43gBApNrp40L1VTVZ1iGag+3O1" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2012-11-27T23:59:59.0000000+00:00" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" /> </ConfigurationSettings> <Certificates> <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="AA23016CF0BDFC344400B5B82706B608B92E4217" thumbprintAlgorithm="sha1" /> </Certificates> </Role> </ServiceConfiguration>          Okay let’s look at them one at a time,       Enabled - Yes, we would like to enable Remote Access.       AccountUserName – This is the user name you entered while you were on the publish windows azure role screen, as detailed above.       AccountEncrytedPassword – Try and decode that, the certificate is used to encrypt the password you specified for the user account. Remember earlier i said, either use the instructions or wait and i’ll be showing you encryption, now the user account i am using for rdp has the same password as my domain password, so i can simply copy the value of the AccountEncryptedPassword to the DomainPassword as well.       AccountExpiration – This is the expiration as you specified in the wizard earlier, make sure your account does not expire today.       Remote Forwarder – Check out the documentation, below is how I understand it, -- One role in an application that implements a remote desktop connection must import the RemoteForwarder module. The two modules work together to enable the remote desktop connections to role instances. -- If you have multiple roles defined in the service model, it does not matter which role you add the RemoteForwarder module to, but you must add it to only one of the role definitions.       Certificate – Remember the certificate thumbprint from the wizard, the on premise machine and windows azure role machine that need to speak to each other must have the same thumbprint. More on that when we install Windows Azure connect Endpoints on the on premise machine. As i said earlier, in this blog post, I’ll be showing you the manual process so i won’t be scripting any star up tasks to install the test agent or register the test agent with the TFS Server. I’ll be showing you all this cool stuff in the next blog post, that’s because it’s important to understand the manual side of it, it becomes easier for you to troubleshoot in case something fails. Having said that, the changes we have made are sufficient to spin up the Windows Azure Worker Role aka Test Agent VM, have it connected with the play.pit.com domain and have remote access enabled on it. Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server. II. Windows Azure Connect: Setting up Connect on VM – 2 i.e. TFS & Test Controller Glad you made it so far, now to enable communication between the on premise TFS/Test Controller and Azure-ed Test Agent we need to enable communication. 
Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server.

II. Windows Azure Connect: Setting up Connect on VM – 2, i.e. TFS & Test Controller

Glad you made it so far. To enable communication between the on-premise TFS/Test Controller and the Azure-hosted Test Agent, the Connect endpoint has to be enabled on the on-premise machine. We have already set up the Azure Connect module in the Test Agent configuration; let’s have a look at the on-premise side.

Log on to VM – 2, the machine running the TFS Server and Test Controller. Log on to the Windows Azure Management Portal and click on Virtual Network. If you already have a subscription you should see the screen shot below; if not, you will be asked to complete the subscription first. Click on Install Local Endpoints at the top left of the panel and you get a URL with a token id appended to it. Remember the token I showed you earlier: in theory the token you get here should match the token you added to the Test Agent config file. Copy the URL to the clipboard and paste it into Internet Explorer (important: at present the installation only works from IE, and you need to have cookies enabled in order to complete it). As stated in the pop-up, you can NOT download and run the software later; you need to run it as is, since it contains the token. Once the installation completes you should see the Windows Azure Connect icon in the system tray. Right-click the Azure Connect icon, choose Diagnostics and refer to this link for the diagnostic terminology.

NOTE – Unfortunately I could not see the Windows Azure Connect icon in the system tray. A bit of searching revealed that the icon is only shown when the ‘Windows Azure Connect Endpoint’ service is started. So go to services.msc and make sure that the service is started; if not, start it. Unfortunately again, the service did not start for me on a manual start, and I realised that one of the dependent services was disabled. Look at the service dependencies, start them, and then start Windows Azure Connect. Bottom line: you need to start the Windows Azure Connect Endpoint service before you can proceed. Please refer to MSDN for more on troubleshooting Windows Azure Connect. (Follow the next step as well.)

Now go back to the Windows Azure Management Portal and from Groups and Roles create a new group; let’s call it ‘Test Rig’. Make sure you add VM – 2 (the TFS Server VM where you just installed the endpoint). Now if you go back to the Azure Connect icon in the system tray and click ‘Refresh Policy’, you will notice that the icon’s status changes from disconnected to ready for connection.

III. Importing the Certificate into the Windows Azure Management Portal

Before publishing, you need to import the certificate you created in Step I into the Windows Azure Management Portal. Log on to the Windows Azure Management Portal, click on ‘Hosted Services, Storage Accounts & CDN’, then ‘Management Certificates’, followed by Add Certificates as shown in the screen shot below. Browse to the location where you saved the certificate earlier (refer to Step I in case you forgot). You should now be able to see the imported certificate here; make sure its thumbprint matches the one you inserted in the config files.

IV. Publish Windows Azure Worker Role aka Test Agent

Having completed I, II and III, you are ready to publish the Test Agent VM – 3 to the cloud. Go to Visual Studio, right-click the Windows Azure project and select Publish. Verify the information in the wizard; from the Advanced Settings tab you can also enable capture of IntelliTrace or profiling information. Click Next and click Publish!
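While the deployment churns away, it is worth re-checking the NOTE from step II on the on-premise box: if the connect endpoint service is not running, the tray icon never appears and the policy refresh goes nowhere. Below is a rough PowerShell sketch of that check; the display name pattern is an assumption on my part, so match it against whatever you see in services.msc.

# Run on VM - 2 (the on-premise TFS/Test Controller machine).
# The display name pattern is an assumption - confirm it against services.msc.
$svc = Get-Service -DisplayName "Windows Azure Connect Endpoint*" | Select-Object -First 1
$svc | Format-Table Status, Name, DisplayName -AutoSize

# Start any stopped dependencies first, then the endpoint service itself.
# (A disabled dependency has to be re-enabled in services.msc before this will work.)
$svc.ServicesDependedOn | Where-Object { $_.Status -ne 'Running' } | Start-Service
if ($svc.Status -ne 'Running') { Start-Service -InputObject $svc }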
From the View menu select the Windows Azure Activity Log window. Now you should be able to see the deployment progress in real time. In the Windows Azure Management Portal, you should also be able to see the progress of the creation of the new Worker Role.

Once the deployment is complete you should be able to RDP into the Test Agent Worker Role VM from the Playpit network using the domain admin user account (go to the Run prompt, type mstsc and enter the machine name in the pop-up). If you are unable to log in to the Test Agent using the domain admin user account, it means the process of joining the Test Agent to the domain has failed! The good news is that, because you imported the Connect module, you can still connect to the Test Agent machine through the Windows Azure Management Portal and troubleshoot the reason for the failure: log in with the user name and password you specified in the config file for the keys ‘RemoteAccess.AccountUsername’ and ‘RemoteAccess.AccountEncryptedPassword’ (just enter the password unencrypted), then fix the issue or manually join the machine to the domain. Once you have managed to join the Test Agent VM to the domain, move to the next step.

So, log in to the Test Agent Worker Role VM with the Playpit domain administrator and verify that you can log in, that the machine is connected to the domain and that the Connect service is running. If yes, give yourself a pat on the back – you are 80% mission accomplished!

Go to the Windows Azure Management Portal, click on Virtual Network, click on Groups and Roles, click on Test Rig and then Edit Group to edit the Test Rig group you created earlier. In the Connect to section, click Add to select the worker role you have just deployed. Also check ‘Allow connections between endpoints in the group’; this enables communication between the test controller and the test agents, and between the test agents themselves. Click Save. Now you are ready to deploy the Test Agent software on the Worker Role Test Agent VM and configure it to work with the Test Controller.

V. Configuring VM – 3: Installing the Test Agent and Associating It with the Controller

Log in to the Worker Role Test Agent VM that you have just deployed, making sure you log in with the domain administrator account. Download the Visual Studio Agents 2010 (‘All Agents’) software from MSDN, ‘en_visual_studio_agents_2010_x86_x64_dvd_509679.iso’, extract the ISO and navigate to where you extracted it. In my case, I extracted the ISO to “C:\Resources\Temp\VsAgentSetup”. Open the Test Agent folder and double-click setup.exe. Once you have installed the Test Agent you should reach the configuration window. If you face any issues installing the TFS Test Agent on the VM, refer to the walkthrough on MSDN.

Once you have successfully installed the Test Agent software you will need to configure it. Right-click the Test Agent configuration tool and run it as a different user, i.e. an Administrator; this is really just to run the configuration wizard with elevated privileges (UAC might block some things otherwise). In the run options you can select ‘service’; you do not need to run the agent as interactive unless you are running coded UI tests. I have specified the domain administrator account to connect to the TFS Test Controller. In real life I would never do that; I would create a separate test user service account for this purpose. But for this blog post, we are using the most powerful user so that no policies or restrictions block you.
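Before you hit Apply Settings, a couple of quick sanity checks from the agent VM can save a round of head-scratching. This is only a sketch under a few assumptions: the controller host name below is a placeholder for your VM – 2, and 6901 is the default test controller port on my rig; adjust both if yours differ.

# Run on the worker-role Test Agent VM (VM - 3).
# 1. Confirm the machine really is joined to the domain.
(Get-WmiObject Win32_ComputerSystem).Domain        # expect: play.pit.com

# 2. Confirm the test controller is reachable on its listening port.
#    "VM2-TFS.play.pit.com" is a placeholder - use your VM - 2 host name; 6901 is the
#    default controller port, change it if you have reconfigured the controller.
$controller = "VM2-TFS.play.pit.com"
$client = New-Object System.Net.Sockets.TcpClient
try     { $client.Connect($controller, 6901); "Controller port reachable" }
catch   { "Cannot reach $controller on 6901 - check connect group membership and firewall" }
finally { $client.Close() }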
Click the Apply Settings button and you should be all green! If not, the summary usually gives helpful error messages that you can resolve before proceeding; in my experience you are most likely to run into either a permissions issue or a firewall blocking communication.

And now the moment of truth! Go to VM – 2, open Visual Studio and from the Test menu select Manage Test Controller. Mission accomplished: you should be able to see the Test Agent that you have just configured.

VI. Creating and Running Load Tests on Your Brand New Azure Test Rig

I have various blog posts on performance testing with Visual Studio Ultimate; you can follow the links and videos below.

Blog Posts:
- Part 1 – Performance Testing using Visual Studio 2010 Ultimate
- Part 2 – Performance Testing using Visual Studio 2010 Ultimate
- Part 3 – Performance Testing using Visual Studio 2010 Ultimate

Videos:
- Test Tools Configuration & Settings in Visual Studio
- Why & How to Record Web Performance Tests in Visual Studio Ultimate
- Goal Driven Load Testing using Visual Studio Ultimate

Once you have created your load tests, there is one last change to make before you can run them on your Azure test rig: create a new test settings file, change the test execution method to ‘Remote Execution’ and select the test controller you configured the worker role Test Agent against, in our case VM – 2. So go on, fire off a test run and see the results of the test being executed on the Azure-hosted test rig.

Review and What’s Next?

A quick recap of the benefits of running the test rig in the cloud and what I will be covering in the next blog post. And I would love to hear your feedback!

Advantages:
- Utilize the power of Azure compute to run a heavy virtual user load.
- Benefit from Azure’s flexibility: destroy Test Agents when not in use; it takes less than 25 minutes to spin up a new one.
- Most importantly, test network latency (network latency and connection speed are two different things, and latency is usually very hard to test). By placing Test Agents in Microsoft data centres around the globe, you can measure the lag in transferring the bytes caused not by a slow connection but by the page being requested from the other side of the globe.

Next Steps:
The process of spinning up the Test Agents in Windows Azure is not 100% automated. I am working on the worker process and PowerShell scripts to automate the role deployment, the unattended install of the test agent software and the registration of the test agent with the test controller. In the next blog post I will show you how to make the complete process unattended and automated.

Remember to subscribe to http://feeds.feedburner.com/TarunArora. Hope you enjoyed this post; I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment. See you in Part III.
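PS – once the rig is up you don’t have to drive every run from the Visual Studio UI; mstest.exe can kick off the same load test against the remote test settings file. A minimal sketch follows; the .loadtest and .testsettings file names are placeholders, so substitute your own.

# File names are placeholders - point these at your own load test and the
# 'Remote Execution' test settings file created in section VI.
& "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe" `
    /testcontainer:MyLoadTest.loadtest `
    /testsettings:AzureTestRig.testsettings `
    /resultsfile:AzureTestRigRun.trx

The run still executes on the Azure agents, because the execution method lives in the test settings file rather than in how the test is launched.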

    Read the article

  • Is this the right use case for the exec method of child_process? Is there a way to copy the environment, along with the required modules, too?

    - by L2L2L
    I'm learning Node. I am using child_process to move data to another script to be executed, but it seems that it does not copy the whole environment, or I could be doing something wrong. Is there a way to copy the whole environment, required modules too, or is that what spawn is for? I'm not clear on the differences between spawn, exec and execFile (although execFile looks like what I'm doing at the bottom, but with exec... right?). I would just love to have some clarity on this matter. Please, anyone? Thank you.

    parent.js -
    "use strict";
    var fs, path, _err;
    fs = require("fs"),
    path = require("path"),
    _err = require("./err.js");

    var url;
    url = process.argv[1];

    var dirname, locate_r;
    dirname = path.dirname(url);
    locate_r = dirname + "/" + "test.json"; //path.join(dirname,"/", "test.json");

    var flag, str;
    flag = "r",
    str = "";

    fs.open(locate_r, flag, function opd(error, fd){
      if (error){_err(error, function(){
        fs.close(fd, function(){
          process.stderr.write("\n" + "In Finally Block: File Closed!" + "\n");});})}

      var readBuff, buffOffset, buffLength, filePos;
      readBuff = new Buffer(15),
      buffOffset = 0,
      buffLength = readBuff.length,
      filePos = 0;

      fs.read(fd, readBuff, buffOffset, buffLength, filePos, function rd(error, readBytes){
        error&&_err(error, fd);
        str = readBuff.toString("utf8");
        process.env.str = str;
        process.stdout.write("str: " + str + "\n" + "readBuff: " + readBuff + "\n");
        fs.close(fd, function(){process.stdout.write( "Read and Closed File." + "\n" )});
        //write(str);

        //run test for process.exec**
        var env, varName, envCopy, exec;
        env = process.env, varName, envCopy = {}, exec = require("child_process").exec;
        for(varName in env){
          envCopy[varName] = env[varName];
        }
        process.env.fs = fs,
        process.env.path = path,
        process.env.dirname = dirname,
        process.env.flag = flag,
        process.env.str = str,
        process.env._err = _err;
        process.env.fd = fd;

        exec("node child.js", env, function(error, stdout, stderr){
          if(error){throw (new Error(error));}
        });
      });
    });

    child.js -
    "use strict";
    var fs, path, _err;
    fs = require("fs"),
    path = require("path"),
    _err = require("./err.js");

    var fd, fs, flag, path, dirname, str, _err;
    fd = process.env.fd,
    //fs = process.env.fs,
    //path = process.env.path,
    dirname = process.env.dirname,
    flag = process.env.flag,
    str = process.env.str,
    _err = process.env._err;

    var url;
    url = process.argv[1];

    var locate_r;
    dirname = path.dirname(url);
    locate_r = dirname + "/" + "test.json"; //path.join(dirname,"/", "test.json");

    //function write(str){
    var locate_a;
    locate_a = dirname + "/" + "test.json"; //path.join(dirname,"/", "test.json");
    flag = "a";
    fs.open(locate_a, flag, function opd(error, fd){
      error&&_err(error, fs, fd);

      var writeBuff, buffPos, buffLgh, filePs;
      writeBuff = new Buffer(str),
      process.stdout.write( "writeBuff: " + writeBuff + "\n" + "str: " + str + "\n"),
      buffPos = 0,
      buffLgh = writeBuff.length,
      filePs = buffLgh; //null;

      fs.write(fd, writeBuff, buffPos, buffLgh, filePs-3, function(error, written){
        error&&_err(error, function(){
          fs.close(fd, function(){
            process.stderr.write("\n" + "In Finally Block: File Closed!" + "\n");
          });
        });
        fs.close(fd, function(){process.stdout.write( "Written and Closed File." + "\n");});
      });
    });
    //}

    err.js -
    "use strict";
    var fs;
    fs = require("fs");
    module.exports = function _err(err, scp, cd){
      try{
        throw (new Error(err));
      }catch(e){
        process.stderr.write(e + "\n");
      }finally{
        cd;
      }
    }

    Read the article

  • IOUG and Oracle Enterprise Manager User Community Twitter Chat and Sessions at OpenWorld

    - by Anand Akela
    As in many past years, we will have the annual Oracle Users Forum on Sunday, September 30th, 2012 at Moscone West, Levels 2 & 3. It is open to all registered attendees of Oracle OpenWorld and the conferences running from September 29 to October 5, 2012. This is a great opportunity to meet with colleagues, peers, and subject matter experts to share best practices, tips, and techniques around Oracle technologies. You can sit in on a special interest group (SIG) meeting or session and learn how to get more out of Oracle technologies and applications. IOUG and the Oracle Enterprise Manager team invite you to join a Twitter chat on Sunday, Sep. 30th from 11:30 AM to 12:30 PM. IOUG leaders, Enterprise Manager SIG contributors and many Oracle Users Forum speakers will answer questions related to their experience with Oracle Enterprise Manager and the activities and resources available for Enterprise Manager SIG members. You can participate in the chat using the hash tag #em12c on Twitter.com or by going to tweetchat.com/room/em12c (a Twitter credential is needed to participate). Feel free to join IOUG and Enterprise Manager team members at the User Group Pavilion on the 2nd Floor, Moscone West. Here is the complete list of Oracle Enterprise Manager sessions during the Oracle Users Forum:

    Time | Session | Speakers | Location
    8:00 AM - 8:45 AM | UGF4569 - Oracle RAC Migration with Oracle Automatic Storage Management and Oracle Enterprise Manager 12c | Vinod Emmanuel - Database Engineering, Dell, Inc.; Wendy Chen - Sr. Systems Engineer, Dell, Inc. | Moscone West - 2011
    8:00 AM - 8:45 AM | UGF10389 - Monitoring Storage Systems for Oracle Enterprise Manager 12c | Anand Ranganathan - Product Manager, NetApp | Moscone West - 2016
    9:00 AM - 10:00 AM | UGF2571 - Make Oracle Enterprise Manager Sing and Dance with the Command-Line Interface | Ray Smith - Senior Database Administrator, Portland General Electric | Moscone West - 2011
    10:30 AM - 11:30 AM | UGF2850 - Optimal Support: Oracle Enterprise Manager 12c Cloud Control, My Oracle Support, and More | April Sims - DBA, Southern Utah University | Moscone West - 2011
    11:30 AM - 12:30 PM | IOUG and Oracle Enterprise Manager Joint Tweet Chat - Join IOUG leaders, IOUG's Enterprise Manager SIG contributors and speakers on Twitter and ask questions related to practitioners' experience with Oracle Enterprise Manager and the new IOUG Enterprise Manager SIG. To participate, use the hash tag #em12c on twitter.com or your favorite Twitter client, or go to tweetchat.com/room/em12c to watch the conversation or log in with your Twitter credentials to ask questions. | User Group Pavilion, 2nd Floor, Moscone West
    12:30 PM - 2:00 PM | UGF5131 - Migrating from Oracle Enterprise Manager 10g Grid Control to 12c Cloud Control | Leighton Nelson - Database Administrator, Mercy | Moscone West - 2011
    2:15 PM - 3:15 PM | UGF6511 - Database Performance Tuning: Get the Best out of Oracle Enterprise Manager 12c Cloud Control | Mike Ault - Oracle Guru, Texas Memory Systems Inc.; Tariq Farooq - CEO/Founder, BrainSurface | Moscone West - 2011
    3:30 PM - 4:30 PM | UGF4556 - Will It Blend? Verifying Capacity in Server and Database Consolidations | Jeremiah Wilton - Database Technology, Blue Gecko / DatAvail | Moscone West - 2018
    3:30 PM - 4:30 PM | UGF10400 - Oracle Enterprise Manager 12c: Monitoring, Metric Extensions, and Configuration Best Practices | Kellyn Pot'Vin - Sr. Technical Consultant, Enkitec | Moscone West - 2011

    Stay Connected: Twitter | Facebook | YouTube | LinkedIn | Newsletter

    Read the article

  • Identity R2 - Experts Podcast Series

    - by Tanu Sood
    To follow up on the Identity Management R2 launch, a series of podcasts were recorded with subject matter experts from customer organizations, our partners and Oracle's PM team to discuss key trends, R2 capabilities, implementation best practices and more. Below is a roll-up of the podcast series, available on Fusion Middleware Radio.

    R2 Podcasts:

    - Designing the Next-Generation Identity Platform - Vadim Lander, Oracle. Highlights: common architecture model, integration, interoperability and the driving factors behind R2 innovation. IT departments are shifting their Identity Management strategy to be able to support mobile, cloud and social applications. Oracle has anticipated this shift and has built a product roadmap to take advantage of this focus. Join Vadim as he discusses the design strategy behind the latest 11gR2 release and talks about how IDM services have to evolve to meet this new challenge.

    - BETA Customer Perspective on R2 - Ravi Meduri, Kaiser Permanente. Highlights: R2 scalability and high availability. In this podcast Ravi discusses the new features in 11gR2 that he is most interested in, including high availability options for Access Management, multi-datacenter architecture, and what it was like working with the Oracle product team during the BETA program.

    - Partner Perspective on R2 - Rex Thexton, PricewaterhouseCoopers. Highlights: usability enhancements for users and administrators. A lot of new usability features went into the 11gR2 release, making this the most business-friendly IDM release to date. In this podcast Rex Thexton, Managing Director from PwC, talks about some of the new UI changes for both end users and administrators, and also about the new connector creation framework.

    - Access Request Updates in R2 - Marc Boroditsky, Oracle. Highlights: access request user interface innovations. A lot of changes have been made to the access request user interface in the latest version of Oracle Identity Manager 11gR2. A real focus has been put on making the request process more business-user friendly, and a lot of new customization capability has been added for IT administrators. Hear Marc discuss the updated UI and explain how administrators will be able to customize OIM to meet their company's requirements.

    - Oracle Optimized System for Oracle Unified Directory (OOS4OUD) - Nick Kloski, Oracle. Highlights: new Optimized System configuration for Unified Directory. One of the new features in 11gR2 is the availability of an Optimized System configuration for Oracle Unified Directory. Oracle engineers installed the OUD software onto off-the-shelf hardware and then created a performance-tuned configuration. Join us as we talk to Nick Kloski, Infrastructure Solutions Manager, all about the testing process and the resulting performance metrics.

    - Privileged Account Management - Mark Wilcox, Oracle. Highlights: Oracle Privileged Account Manager key capabilities and use cases. The new release of Oracle Identity Management 11gR2 includes the capability to manage privileged accounts. Privileged accounts, if compromised, create a risk for fraud in the enterprise, and as a result controlling access to privileged accounts is critical. Hear what Mark Wilcox, Principal Product Manager of Oracle Privileged Account Manager, has to say about the capabilities of the offering in this podcast.

    - Browser-Based User Interface (UI) Customization - Clayton Donley, Oracle. Highlights: benefits of the durable UI configuration framework. Business users need user interfaces that are not only friendly but also easily customizable. However, the downside of any customization project is the cost and complexity involved in developing, testing, deploying and managing custom code. In this podcast, we examine how a new capability in Oracle Identity Management around browser-based UI customization can reduce the cost and complexity of customization while simplifying self-service integration with corporate portal strategies.

    - Simplifying Mobile and Social Sign-On - Dan Killmer, Oracle. Highlights: secure mobile sign-on and consumption of social identities with Oracle Access Management. The proliferation of mobile devices has spurred a new trend where employees tend to bring their own mobile devices to work and access corporate applications the same way they would from a desktop or laptop. In this podcast, we examine how Oracle's latest innovation in Identity Management around mobile and social sign-on can simplify the security and access management challenges posed by the widespread adoption of mobile devices in the enterprise.

    - Enabling Your Business with IDM R2 - Scott Bonnell, Oracle. Highlights: self service, mobile access, personalization. Gone are the days when Identity Management was just about stopping unauthorized users in their tracks. Identity Management, if done right, can also enable your business. Join Scott Bonnell as he discusses how the IDM 11gR2 release enables the enterprise by providing self service, personalization and mobile access to corporate resources.

    Read the article

< Previous Page | 79 80 81 82 83 84 85 86 87 88 89 90  | Next Page >