Search Results

Search found 61393 results on 2456 pages for 'data integration'.


  • Oracle ODP.NET and Windows PowerShell

    - by cjandaus
    Well known in the Microsoft world, met with little more than a shrug in the Oracle world - the so-called Scripting Guys. As the name suggests, their Hey, Scripting Guy! blog is all about scripting. And that, of course, means Windows PowerShell. Yes, the days of the DOS command window and batch files are over. PowerShell is a powerful scripting environment on Windows that even Unix/Linux administrators should take a liking to. That you can also use it beautifully to access Oracle databases is something we demonstrated years ago in an Oracle workshop series. Back then I was joined by Klaus Rohe from Microsoft, who also gave a joint talk with me at the DOAG conference. Our shared goal, then as now, was to convince Oracle users of the excellent integration between Oracle, Windows, and .NET. What could be more natural than having both vendors confirm this together? The perpetual doubters in particular welcomed it. Since then, PowerShell has dropped off my radar, and Oracle users have not raised the topic either. Perhaps because it is too new or too unfamiliar? Rather unlikely ... Maybe it is more that people simply assume PowerShell can only really be used with Microsoft products? Or they are told that only integration with Microsoft's own SQL Server database is possible? And that, of course, is not true - as usual (Microsoft Active Directory comes to mind, among other things, but more on that another time). I am all the more pleased to read a brand-new blog post on exactly this topic, which Alex Keh (product manager for Windows and .NET at Oracle headquarters in San Francisco) brought to my attention. What makes it even better is that this post comes from the Microsoft world, showing between the lines that the Oracle database and our .NET integration via the Oracle Data Provider for .NET (ODP.NET) play a significant role there as well. In that spirit: two thumbs up for the Scripting Guys! The post is called Use Oracle ODP.NET and PowerShell to Simplify Data Access, and despite a few minor slips it is highly recommended as a way into the topic. Let me know where you stand on this integration, whether PowerShell is or could become important to you in practice, and whether you are missing features that Oracle should implement in the future. Thanks!
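    As a minimal sketch of the kind of ODP.NET data access the post is praising, the C# fragment below opens a connection and reads a query result; the connection string and the DEPARTMENTS query are illustrative assumptions only, and a PowerShell script would drive the very same Oracle.DataAccess.Client types.

        using System;
        using Oracle.DataAccess.Client; // ODP.NET provider

        class OdpNetQuerySample
        {
            static void Main()
            {
                // Hypothetical credentials and TNS alias; replace with your own.
                const string connectionString = "User Id=hr;Password=hr;Data Source=ORCL";

                using (var connection = new OracleConnection(connectionString))
                using (var command = new OracleCommand(
                    "SELECT department_id, department_name FROM departments", connection))
                {
                    connection.Open();
                    using (OracleDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine("{0}: {1}", reader.GetDecimal(0), reader.GetString(1));
                        }
                    }
                }
            }
        }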

    Read the article

  • A few questions about integrating AudioKinetic Wwise and Unity

    - by SaldaVonSchwartz
    I'm new to Wwise and to using it with Unity, and though I have gotten the integration to work, I'm still dealing with some loose ends and have a few questions: (I'm on Unity 4.3 as of now but I think it shouldn't make any difference) The base path: The Wwise documentation implies you set this in the AkGlobalSoundEngineInitializer basePath public ivar, which is exposed to the editor. However, I found that this variable is not really used. Instead, the path is hardcoded to /Audio/GeneratedSoundBanks in AkBankPath. I had to modify both scripts to actually look in the path that I set in the editor property. What's the deal with this? Just sloppiness or am I missing something? Also about paths: since I'm on Mac, I'm using Unity natively under OS X and, in tandem, the Wwise authoring tool via VMWare, and I share the OS X Unity project folder so I can generate the soundbanks into the assets folder. However, the authoring tool (downloaded the latest one for Windows) doesn't automatically generate any "platform-specific" subfolders for my wwise files. That is, again, the Unity integration scripts assume the path to be /Audio/GeneratedSoundBanks/<my-platform>/ which in my case would be Mac (I set the authoring tool to generate for Mac). The documentation says wwise will automatically generate the platform-specific folders but it just dumps all the stuff in GeneratedSoundBanks. Am I missing some setting? Because right now I just manually create the /Mac folder. The C# methods AkSoundEngine.PostEvent and AkSoundEngine.LoadBank, for instance, have a few overloads, including ones where I can refer to the soundbanks or events by their ID. However, if I try to use these, for instance: AkSoundEngine.LoadBank(, AkSoundEngine.AK_DEFAULT_POOL_ID) where the int I got from the .h header, I get Ak_Fail. If I use the overloads that reference the objects by string name then it works. What gives? Converting the ID header to C#: The integration comes with a C# script that seems to fork a process to call Python in turn to convert the C++ header into a C# script. This always fails unless I manually execute the Python script myself from outside Unity. Might be a permissions thing, but has anyone experienced this? The Profiler: I set up the Unity player to run in the background and am using the "Profile" version of the plugin. However, when I start the Unity OS X standalone app, the profiler in VMWare does not see it. This I'm thinking might just be that I'm trying to see a running instance of the sound engine inside an OS X binary from a Windows virtual machine. But I'm just wondering if anyone has gotten the Windows profiler to see an OS X Unity binary. Different versions of the integration plugin: It's not clear to me from the documentation whether I have to manually (or write a script to do it) remove the "Profile" version and install the "Release" version when I'm going to do a Release build, or if I should install both versions in Unity and it'll select the right one. Thanks!

    Read the article

  • Curing the Database-Application mismatch

    - by Phil Factor
    If an application requires access to a database, then you have to be able to deploy it so as to be version-compatible with the database, in phase. If you can deploy both together, then the application and database must normally be deployed at the same version in which they, together, passed integration and functional testing.  When a single database supports more than one application, then the problem gets more interesting. I’ll need to be more precise here. It is actually the application-interface definition of the database that needs to be in a compatible ‘version’.  Most databases that get into production have no separate application-interface; in other words they are ‘close-coupled’.  For this vast majority, the whole database is the application-interface, and applications are free to wander through the bowels of the database scot-free.  If you’ve spurned the perceived wisdom of application architects to have a defined application-interface within the database that is based on views and stored procedures, any version-mismatch will be as sensitive as a kitten.  A team that creates an application that makes direct access to base tables in a database will have to put a lot of energy into keeping Database and Application in sync, to say nothing of having to tackle issues such as security and audit. It is not the obvious route to development nirvana. I’ve been in countless tense meetings with application developers who initially bridle instinctively at the apparent restrictions of being ‘banned’ from the base tables or routines of a database.  There is no good technical reason for needing that sort of access that I’ve ever come across.  Everything that the application wants can be delivered via a set of views and procedures, and with far less pain for all concerned: This is the application-interface.  If more than zero developers are creating a database-driven application, then the project will benefit from the loose-coupling that an application interface brings. What is important here is that the database development role is separated from the application development role, even if it is the same developer performing both roles. The idea of an application-interface with a database is as old as I can remember. The big corporate or government databases generally supported several applications, and there was little option. When a new application wanted access to an existing corporate database, the developers, and myself as technical architect, would have to meet with hatchet-faced DBAs and production staff to work out an interface. Sure, they would talk up the effort involved for budgetary reasons, but it was routine work, because it decoupled the database from its supporting applications. We’d be given our own stored procedures. One of them, I still remember, had ninety-two parameters. All database access was encapsulated in one application-module. If you have a stable defined application-interface with the database (Yes, one for each application usually) you need to keep the external definitions of the components of this interface in version control, linked with the application source,  and carefully track and negotiate any changes between database developers and application developers.  Essentially, the application development team owns the interface definition, and the onus is on the Database developers to implement it and maintain it, in conformance.  Internally, the database can then make all sorts of changes and refactoring, as long as source control is maintained.  
If the application interface passes all the comprehensive integration and functional tests for the particular version they were designed for, nothing is broken. Your performance-testing can ‘hang’ on the same interface, since databases are judged on the performance of the application, not an ‘internal’ database process. The database developers have responsibility for maintaining the application-interface, but not its definition, as they refactor the database. This is easily tested on a daily basis since the tests are normally automated. In this setting, the deployment can proceed if the more stable application-interface, rather than the continuously-changing database, passes all tests for the version of the application. Normally, if all goes well, a database with a well-designed application interface can evolve gracefully without changing the external appearance of the interface, and this is confirmed by integration tests that check the interface, and which hopefully don’t often need to be altered at all. If the application is rapidly changing its ‘domain model’ in the light of an increased understanding of the application domain, then it can change the interface definitions and the database developers need only implement the interface rather than refactor the underlying database. The test team will also have to redo the functional and integration tests which are, of course, ‘written to’ the definition. The database developers will find it easier if these tests are done before their re-wiring job to implement the new interface. If, at the other extreme, an application receives no further development work but survives unchanged, the database can continue to change and develop to keep pace with the requirements of the other applications it supports, and needs only to take care that the application interface is never broken. Testing is easy since your automated scripts to test the interface do not need to change. The database developers will, of course, maintain their own source control for the database, and will be likely to maintain versions for all major releases. However, this will not need to be shared with the applications that the database serves. On the other hand, the definition of the application interfaces should be within the application source. Changes in it have to be subject to change-control procedures, as they will require a chain of tests. Once you allow, instead of an application-interface, an intimate relationship between application and database, we are in the realms of impedance mismatch, over and above the obvious security problems. Part of this impedance problem is a difference in development practices. Whereas the application has to be regularly built and integrated, this isn’t necessarily the case with the database. An RDBMS is inherently multi-user and self-integrating. If the developers work together on the database, then a subsequent integration of the database on a staging server doesn’t often bring nasty surprises. A separate database-integration process is only needed if the database is deliberately built in a way that mimics the application development process, but which hampers the normal database-development techniques. This process is like demanding an official walking with a red flag in front of a motor car. In order to closely coordinate databases with applications, entire databases have to be ‘versioned’, so that an application version can be matched with a database version to produce a working build without errors.
There is no natural process to ‘version’ databases. Each development project will have to define a system for maintaining the version level. A curious paradox occurs in development when there is no formal application-interface. When the strains and cracks happen, the extra meetings, bureaucracy, and activity required to maintain accurate deployments look to IT management like work. They see activity, and it looks good. Work means progress. Management then smile on the design choices made. In IT, good design work doesn’t necessarily look good, and vice versa.
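    As a minimal sketch of what the application side of such an interface can look like, assuming a hypothetical dbo.GetCustomerOrders procedure that belongs to the agreed application-interface, the ADO.NET fragment below never references a base table; the database team can refactor whatever sits behind the procedure without breaking the call.

        using System.Data;
        using System.Data.SqlClient;

        class OrdersGateway
        {
            private readonly string _connectionString;

            public OrdersGateway(string connectionString)
            {
                _connectionString = connectionString;
            }

            // All access goes through the agreed interface of views and
            // stored procedures; base tables are never referenced here.
            public DataTable GetCustomerOrders(int customerId)
            {
                using (var connection = new SqlConnection(_connectionString))
                using (var command = new SqlCommand("dbo.GetCustomerOrders", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;

                    var orders = new DataTable();
                    connection.Open();
                    using (var reader = command.ExecuteReader())
                    {
                        orders.Load(reader);
                    }
                    return orders;
                }
            }
        }

    Renaming columns or splitting tables behind dbo.GetCustomerOrders then becomes a database-internal refactoring, which is exactly the loose coupling the article argues for.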

    Read the article

  • Oracle's Sun ZFS Storage Appliances and Oracle VM

    - by uwes
    Oracle's Sun ZFS Storage Appliance Is the Optimal Platform for Deploying Consolidated Applications in an Oracle Virtual Machine (OVM) Environment
    Unsurpassed Integration - the Oracle VM and Storage Engineering teams provide seamless integration points and an Oracle VM Connect Plug-In for Sun ZFS Storage Appliance in FC, NFS, and iSCSI environments. And Sun ZFS Storage is engineered and tested to work with Oracle VM agility features including Live (VM) Migration and Oracle RAC Live Migration. More information can be found under the following links:
    ZFS Storage Appliance Server Virtualization Oracle.com page
    ZFS Storage Appliance Oracle.com page
    ZFS Storage Appliance Oracle Technology Network page
    Software download support.oracle.com page

    Read the article

  • Web Designer and/or Developer

    - by chimps
    We've outsourced our app development, and the devs have created a DB hosted on Amazon EC2. We're in talks with a web designer for the website, but the designer does not do any backend integration, i.e., connecting the website with the DB created by the app developers. Do you recommend getting designs from the designer and hiring a freelancer to do the front-to-back-end integration? I mean, would there be issues/complications? Or should we go with a designer who provides the complete package?

    Read the article

  • Google I/O 2012 - Managing Google Compute Engine Virtual Machines Through Google App Engine

    Google I/O 2012 - Managing Google Compute Engine Virtual Machines Through Google App Engine - Alon Levi, Adam Eijdenberg. Google Compute Engine provides highly efficient and scalable virtual machines for large scale data processing operations. Integration with Google App Engine provides an orchestration framework to manage large virtual machine clusters used for data processing. This session will demonstrate the integration and discuss future use cases of the two technologies. For all I/O 2012 sessions, go to developers.google.com From: GoogleDevelopers Time: 51:06

    Read the article

  • Deterministic/Consistent Unique Masking

    - by Dinesh Rajasekharan-Oracle
    One of the key requirements while masking data in large databases or multi-database environments is to consistently mask some columns, i.e. for a given input the output should always be the same. At the same time the masked output should not be predictable. Deterministic masking also eliminates the need to spend an enormous amount of time identifying data relationships, i.e. parent and child relationships among columns defined in the application tables. In this blog post I will explain different ways of consistently masking data across databases using Oracle Data Masking and Subsetting. Readers of this post should have basic knowledge of Oracle Enterprise Manager 12c, Application Data Modeling, and Data Masking concepts. For more information on these concepts, please refer to the Oracle Data Masking and Subsetting documentation. Oracle Data Masking and Subsetting 12c provides four methods with which users can consistently yet irreversibly mask their inputs:
    1. Substitute
    2. SQL Expression
    3. Encrypt
    4. User Defined Function
    SUBSTITUTE
    The Substitute masking format replaces the original value with a value from a pre-created database table. Because the method uses a hash-based algorithm in the back end, the mappings are consistent. For example, consider DEPARTMENT_ID in the EMPLOYEES table being replaced with FAKE_DEPARTMENT_ID from FAKE_TABLE. The substitute masking transformation ensures that all occurrences of DEPARTMENT_ID, say '101', will be replaced with '502', provided the same substitution table and column are used, i.e. FAKE_TABLE.FAKE_DEPARTMENT_ID. The following screen shot shows the usage of the Substitute masking format within a masking definition. Note that the uniqueness of the masked value depends on the number of unique values in the substitution column, i.e. if the original table contains 50000 unique values, then for the masked output to be unique and deterministic the substitution column should also contain 50000 unique values; without that, only consistency is maintained, but not uniqueness.
    SQL EXPRESSION
    SQL Expression replaces an existing value with the output of a specified SQL expression. For example, while masking an EMPLOYEES table, suppose the EMAIL_ID of an employee has to be in the format FIRST_NAME.LAST_NAME@COMPANY.COM, where FIRST_NAME and LAST_NAME are the actual column names of the EMPLOYEES table; then the corresponding SQL Expression will look like %FIRST_NAME%||'.'||%LAST_NAME%||'@COMPANY.COM'. The advantage of this technique is that if you are masking FIRST_NAME and LAST_NAME of the EMPLOYEES table, then the corresponding EMAIL_ID will be replaced accordingly by the masking scripts. One of the interesting aspects of SQL Expressions is that you can use sub SQL expressions, which means that you can write a nested SQL statement and use it as a SQL Expression to address complex masking business use cases. A SQL Expression can also be used to consistently replace a value with a hashed value using Oracle's built-in ORA_HASH function. The following SQL Expression will help in the previous example for replacing the DEPARTMENT_IDs with a hashed number: ORA_HASH(%DEPARTMENT_ID%, 1000). The following screen shot shows the usage of the SQL Expression masking format within the masking definition. ORA_HASH takes three arguments:
    1. Expression, which can be of any data type except LONG, LOB, or User Defined Type [nested table type is allowed]. In the above example I used the original value as the expression.
    2. Number of hash buckets, which can be a number between 0 and 4294967295. The default value is 4294967295. You can also correlate the number of hash buckets to a range of numbers. In the example above the bucket value is specified as 1000, so the end result will be a hashed number between 0 and 1000.
    3. Seed, which can be any number and which decides the consistency, i.e. for a given seed value the output will always be the same. The default seed is 0. In the above SQL Expression a seed is not specified, so it defaults to 0. If you have to use a non-default seed then the function will look like ORA_HASH(%DEPARTMENT_ID%, 1000, 1234).
    The uniqueness depends on the input and the number of hash buckets used. However, as ORA_HASH uses a 32-bit algorithm, considering the birthday paradox or pigeonhole principle there is a 0.5 probability of collision after 2^32-1 unique values.
    ENCRYPT
    The Encrypt masking format uses a blend of the 3DES encryption algorithm, hashing, and a regular expression to produce a deterministic and unique masked output. The format of the masked output corresponds to the specified regular expression. As this technique uses a key [string] to encrypt the data, the same string can be used to decrypt the data. The key also acts as a seed to maintain consistent outputs for a given input. The following screen shot shows the usage of the Encrypt masking format within the masking definition. Regular Expressions may look complex to first-time users, but you will soon realize that it's a simple language. There are many resources on the internet, in the Oracle documentation, the Oracle Learning Library, and My Oracle Support on writing Regular Expressions; out of all of them, the following My Oracle Support document helped me to get started with Regular Expressions: Oracle SQL Support for Regular Expressions [Video] (Doc ID 1369668.1).
    USER DEFINED FUNCTION [UDF]
    A User Defined Function or UDF provides flexibility for users to code their own masking logic in PL/SQL, which can be called from a masking definition. The standard format of a UDF in Oracle Data Masking and Subsetting is:
        Function udf_func (rowid varchar2, column_name varchar2, original_value varchar2) returns varchar2;
    where
    • rowid is the row identifier of the column that needs to be masked
    • column_name is the name of the column that needs to be masked
    • original_value is the column value that needs to be masked
    You can achieve deterministic masking by using Oracle's built-in hash functions like ORA_HASH, DBMS_CRYPTO.MD4, DBMS_CRYPTO.MD5, and DBMS_UTILITY.GET_HASH_VALUE. Please refer to the Oracle Database documentation for more information on the Oracle hash functions. For example, the following masking UDF generates deterministic unique hexadecimal values for a given string input:
        CREATE OR REPLACE FUNCTION RD_DUX (rid varchar2, column_name varchar2, orig_val VARCHAR2)
        RETURN VARCHAR2
        DETERMINISTIC
        PARALLEL_ENABLE
        IS
          stext varchar2 (26);
          no_of_characters number(2);
        BEGIN
          no_of_characters := 6;
          stext := substr(RAWTOHEX(DBMS_CRYPTO.HASH(UTL_RAW.CAST_TO_RAW(orig_val), 1)), 0, no_of_characters);
          RETURN stext;
        END;
    The uniqueness depends on the input, the length of the output string, and the number of bits used by the hash algorithm. In the above function the MD4 hash is used [denoted by argument 1 in the DBMS_CRYPTO.HASH function], which is a 128-bit algorithm producing 2^128-1 unique hashed values; however, this is limited by the length of the output string, which is 6 hexadecimal characters, so only 16^6 unique values will be generated. Also do not forget about the birthday paradox/pigeonhole principle mentioned earlier in this post.
    Another example is to consistently replace characters or numbers, preserving the length and special characters, as shown below:
        CREATE OR REPLACE FUNCTION RD_DUS (rid varchar2, column_name varchar2, orig_val VARCHAR2)
        RETURN VARCHAR2
        DETERMINISTIC
        PARALLEL_ENABLE
        IS
          stext varchar2(26);
        BEGIN
          DBMS_RANDOM.SEED(orig_val);
          stext := TRANSLATE(orig_val, 'ABCDEFGHIJKLMNOPQRSTUVWXYZ', DBMS_RANDOM.STRING('U', 26));
          stext := TRANSLATE(stext, 'abcdefghijklmnopqrstuvwxyz', DBMS_RANDOM.STRING('L', 26));
          stext := TRANSLATE(stext, '0123456789', to_char(DBMS_RANDOM.VALUE(1, 9)));
          stext := REPLACE(stext, '.', '0');
          RETURN stext;
        END;
    The following screen shot shows the usage of a UDF within a masking definition. To summarize, Oracle Data Masking and Subsetting helps you to consistently mask data across databases using one or all of the methods described in this post. It saves the hassle of identifying the parent-child relationships defined in the application tables. Happy Masking!
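    The bucket-hashing idea behind ORA_HASH can also be pictured outside the database. The C# sketch below is only a conceptual illustration (it is not Oracle's implementation, and the names are made up): hashing the original value together with a fixed seed and reducing it to a bucket gives the same masked output for the same input every time.

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class DeterministicMask
        {
            // The same (value, seed, maxBucket) triple always yields the same bucket,
            // which is what makes the masking deterministic yet hard to predict
            // without knowing the seed.
            public static long ToBucket(string value, long seed, long maxBucket)
            {
                using (var sha = SHA256.Create())
                {
                    byte[] digest = sha.ComputeHash(Encoding.UTF8.GetBytes(seed + ":" + value));
                    ulong n = BitConverter.ToUInt64(digest, 0); // first 8 bytes as a number
                    return (long)(n % (ulong)(maxBucket + 1)); // bucket in [0, maxBucket]
                }
            }
        }

        // Example: DeterministicMask.ToBucket("101", 1234, 1000) returns the same
        // bucket on every call, mirroring the spirit of ORA_HASH(101, 1000, 1234).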

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Alison Weiss Handle with Care Sound project management helps AmeriCares bring international aid to those in need. The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades. “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.” Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process. AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions. Efficiency Saves Lives One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan. When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely. 
Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive. An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.” York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.” Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term. According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams collaborating together. The entire process can take months—from when products are received in the warehouse and deciding which recipients to allocate supplies to, to getting customs and governmental approvals in place, actually shipping products, and finally ensuring that the products are received in-country. Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.” Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott. 
Setting a Standard Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase. D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.” Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was—because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario. With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office. AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears. A Cultural Change Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott. 
“Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.” AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.” The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country. Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.” With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors. Mining Historical Data Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates. The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year—to see if desired outcomes were achieved and if there are areas that need improvement. 
It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.” Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.” And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

  • SSIS Training 15-19 Oct in Reston Virginia

    - by andyleonard
    Early bird registration is now open for Linchpin People’s SSIS training course From Zero To SSIS scheduled for 15-19 Oct 2012 in Reston Virginia! Register today – the early bird discount ends 28 Sep 2012. Training Description From Zero to SSIS was developed by Andy Leonard to train technology professionals in the fine art of using SQL Server Integration Services (SSIS) to build data integration and Extract-Transform-Load (ETL) solutions. The training is focused around labs and emphasizes a hands-on...(read more)

    Read the article

  • Firefox slowdown and 100% CPU after gnome-settings-daemon update

    - by digitaljail
    I'm on Ubuntu 12.04 (Unity) with Firefox 17.0.1 installed. After the latest update of gnome-settings-daemon (3.4.2-0ubuntu0.5, 3.4.2-0ubuntu0.6), Firefox starts taking 100% of my CPU, periodically. I have tried various things: 1) Disabled all the non-standard extensions = no change to the CPU usage 2) Disabled Flash plugins (also updated at the same time) = no change to CPU usage 3) Disabled the "Global Menu integration 3.6.4" extension = HOOOA, CPU OK!!! Any suggestions for getting global menu integration back with no more CPU problems?

    Read the article

  • Necessitas: the Android port of Qt will soon be integrated into the Qt Project; official Android support is planned for 2013

    It had already been announced that Digia intended to keep improving Qt's support for mobile platforms, with the goal of adding Android and iOS to the tier 1 supported platforms during 2013 (that is, as primary platforms). One of the options was integrating the code of Necessitas, the port started by BogDan Vatra for Android: today's good news is that the two parties have agreed to make it happen! The Android port of Qt 5 will be based on the Necessitas project, with BogDan wishing to become its maintainer (in line with the Qt Project's organization): http://qt.developpez.com/actu/38218/...rriv...

    Read the article

  • Top Exastack ISV Headlines: Volante Technologies

    - by Javier Puerta
    Volante Achieves Oracle Exalogic Optimized, Oracle Exadata Ready, Oracle SuperCluster Ready, Oracle Solaris Ready, Oracle Linux Ready and Oracle VM Status. Volante Suite is a suite of modular tools for integration and management of financial data. Used today by major financial institutions, exchanges and industry utilities around the world, Volante enables users to rapidly build data integration solutions among market data feeds, applications and external partners. Volante finds Exalogic is a strong fit for addressing scalability and operating in the most challenging financial services environments. Read more here.

    Read the article

  • dot42 vs Xamarin to develop in C# for Android phones [on hold]

    - by opt
    Does anyone have experience with both dot42 and Xamarin for developing C# apps for Android? Could you please say why one should prefer one over the other? I like the free Visual Studio integration offered by dot42 (while Xamarin requires a subscription for this, and the business one, which is quite expensive). I would also like to know whether Visual Studio integration actually means that one could use all the libraries available for desktop apps. Thank you.

    Read the article

  • SQLAuthority News - Interesting Whitepaper - We Loaded 1TB in 30 Minutes with SSIS, and So Can You

    In February 2008, Microsoft announced a record-breaking data load using Microsoft SQL Server Integration Services (SSIS): 1 TB of data in less than 30 minutes. That data load, using SQL Server Integration Services, was 30% faster than the previous best time using a commercial ETL tool. This paper outlines what it took: the software, hardware, [...]

    Read the article

  • API Management Video

    - by Michael Stephenson
    Originally posted on: http://geekswithblogs.net/michaelstephenson/archive/2014/08/03/157900.aspx
    Just wanted to put the word out that the API Management video from the recent user group meeting is available. The page at the below link has resources from that meeting: http://ukcsug.co.uk/past-events/2014-07-07/
    Also we have our next two meetings available for registration at the following links:
    Hybrid Connections: https://www.eventbrite.com/e/azure-biztalk-services-hybrid-connections-tickets-12216617231?aff=eorg
    Hybrid Integration with Dynamics CRM: https://www.eventbrite.com/e/hybrid-integration-with-microsoft-dynamics-crm-tickets-12398067955?aff=eorg

    Read the article

  • SSIS Training Comes to NYC 30 Jul-3 Aug!

    - by andyleonard
    Linchpin People is excited to announce the scheduling of From Zero To SSIS in New York City 30 Jul – 03 Aug 2012! Training Description From Zero to SSIS was developed by Andy Leonard to train technology professionals in the fine art of using SQL Server Integration Services (SSIS) to build data integration and Extract-Transform-Load (ETL) solutions. The training is focused around labs and emphasizes a hands-on approach. Most technologists learn by doing; this training is designed to maximize the time...(read more)

    Read the article

  • Microsoft Launches Pre-release Updates of AppFabric, BizTalk

    Microsoft Application Infrastructure Virtual Conference highlights integration between new cloud-savvy .NET middleware and Microsoft's integration server.

    Read the article

  • How to Integrate Facebook and Twitter with Java Applications

    Exposure on social media sites such as Facebook, Twitter and LinkedIn has a tremendous impact on business, affecting marketing, networking, analytics and more. Social media is a natural resource for collecting user feedback, comments, suggestions, etc., making the integration of social media with applications increasingly important. This article will discuss the integration of Facebook and Twitter with a Java application.

    Read the article

  • New stuff on Windows Live … Hotmail and Messenger to start

    - by Enrique Lima
    Where to start? First, I would recommend you visit the Preview at http://windowslivepreview.com/hotmail/launch/?lang=en and then start exploring; there you will find the information on Hotmail and Messenger. Why would this be exciting? The Hotmail offering has the following pieces integrated:
    ActiveSync
    Office Web Apps integration
    SkyDrive integration
    Head on over, check it out.

    Read the article

  • ClickOnce Deployment

    This article is taken from the book Continuous Integration in .NET. The authors discuss ClickOnce, a simple way to make a Windows application available over the net without using installation files. They show how to use ClickOnce within the Continuous Integration process.

    Read the article

  • Tracking changes in Entity Framework 4.0 using POCO Dynamic Proxies across multiple data contexts.

    - by Rob Packwood
    I started messing with EF 4.0 because I am curious about the POCO possibilities... I wanted to simulate a disconnected web environment and wrote the following code to simulate this:
    1. Save a test object in the database.
    2. Retrieve the test object.
    3. Dispose of the DataContext associated with the test object I used to retrieve it.
    4. Update the test object.
    5. Create a new data context and persist the changes on the test object that are automatically tracked within the DynamicProxy generated against my POCO object.
    The problem is that when I call dataContext.SaveChanges in the Test method below, the updates are not applied. The testStore entity shows a status of "Modified" when I check its EntityStateTracker, but it is no longer modified when I view it within the new dataContext's Stores property. I would have thought that calling the Attach method on the new dataContext would also bring the object's "Modified" state over, but that appears to not be the case. Is there something I am missing? I am definitely working with self-tracking POCOs using DynamicProxies.

        private static void SaveTestStore(string storeName = "TestStore")
        {
            using (var context = new DataContext())
            {
                Store newStore = context.Stores.CreateObject();
                newStore.Name = storeName;
                context.Stores.AddObject(newStore);
                context.SaveChanges();
            }
        }

        private static Store GetStore(string storeName = "TestStore")
        {
            using (var context = new DataContext())
            {
                return (from store in context.Stores
                        where store.Name == storeName
                        select store).SingleOrDefault();
            }
        }

        [Test]
        public void Test_Store_Update_Using_Different_DataContext()
        {
            SaveTestStore();
            Store testStore = GetStore();
            testStore.Name = "Updated";

            using (var dataContext = new DataContext())
            {
                dataContext.Stores.Attach(testStore);
                dataContext.SaveChanges(SaveOptions.DetectChangesBeforeSave);
            }

            Store updatedStore = GetStore("Updated");
            Assert.IsNotNull(updatedStore);
        }
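    One commonly suggested adjustment for this kind of disconnected scenario (offered here only as a sketch against the question's own DataContext and testStore, not a verified fix for the poster's exact setup) is to tell the new context explicitly that the attached entity is modified, since Attach registers it as Unchanged:

        using (var dataContext = new DataContext())
        {
            dataContext.Stores.Attach(testStore);
            // Attach puts the entity in the Unchanged state, so flip it to
            // Modified before saving; otherwise SaveChanges writes nothing.
            dataContext.ObjectStateManager.ChangeObjectState(testStore, System.Data.EntityState.Modified);
            dataContext.SaveChanges();
        }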

    Read the article

  • Unable to validate data. at System.Web.Configuration.MachineKeySection.GetDecodedData

    - by Ben Williams
    I have several websites which get approximately 3000 pageviews in total per day, and I get this viewstate error roughly 5-10 times per day, caught in global.asax: System.Web.HttpException: Unable to validate data. at System.Web.Configuration.MachineKeySection.GetDecodedData(Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Int32& dataLength) at System.Web.UI.ObjectStateFormatter.Deserialize(String inputString) I have tried:
    hard-coding the machine key in web.config for all websites
    hard-coding the machine key in machine.config
    adding items to the pages section of the web.config for all websites
    Machine key looks like: <machineKey validationKey="key goes here" decryptionKey="key goes here" validation="SHA1" decryption="AES" /> Pages section looks like: <pages renderAllHiddenFieldsAtTopOfForm="true" validateRequest="false" enableEventValidation="false" viewStateEncryptionMode="Never"> The errors are not related to application pool recycling as best I can tell, as the pool is set to recycle at every 100,000 requests. I am not running a web farm or web garden. Quite often I get two or three of these errors in a row, as if a user is getting an error, going back, and then clicking the link again. Anyone have any ideas?

    Read the article
