Search Results

Search found 18318 results on 733 pages for 'dynamic management'.


  • Can I have both Windows dynamic disk partition and some other non-Windows partition on the same disk?

    - by haimg
    When a basic Windows disk is converted to dynamic, Windows creates a partition that spans the whole disk with the type "Windows LVM" and manages its dynamic partitions within this space. So even if there is still free space on this disk, it is not visible to any other OS besides Windows. This happens with both MBR and GPT disks. I would like to share a Windows dynamic disk with some other OS. I have to have dynamic disks because I use Windows SoftRaid (mirrors). So, my questions are: Is there any way to "force" Windows to take up less than the whole disk when it converts a basic disk to dynamic? Will Windows tolerate having some other non-Windows partition on its dynamic disk?

    Read the article

  • What's New in Database Lifecycle Management in Enterprise Manager 12c Release 3

    - by HariSrinivasan
    Enterprise Manager 12c Release 3 includes improvements and enhancements across every area of the product. This blog provides an overview of the new and enhanced features in the Database Lifecycle Management area. I will dive deeper into specific features in subsequent posts.

    "What's New?" In this release, we focused on four things:
    1. Lifecycle management support for the new Database 12c - Pluggable Databases
    2. Management of long-running processes, such as a security patch cycle (Change Activity Planner)
    3. Management of a large number of systems by
    · Leveraging new framework capabilities for lifecycle operations, such as the new advanced ‘emcli’ script option
    · Refining features such as configuration search and compliance
    4. Minor improvements and quality fixes to existing features
    · Rollback support for single-instance databases
    · Improved "OFFLINE" patching experience
    · Faster collection of ORACLE_HOME configurations

    Lifecycle Management Support for the New Database 12c - Pluggable Databases
    Database 12c introduces Pluggable Databases (PDBs), a brand-new addition to help you achieve your consolidation goals. Pluggable databases offer unprecedented consolidation at the database level and native lifecycle verbs for creating, plugging and unplugging the databases on a container database (CDB). Enterprise Manager can supplement the capabilities of pluggable databases by offering workflows for migrating, provisioning and cloning them using the software library and the deployment procedures. For example, Enterprise Manager can migrate an existing database to a PDB or clone a PDB by storing a versioned copy in the software library. One can also manage the planned downtime related to patching by migrating the PDBs to a new CDB. While pluggable databases offer these exciting features, they can also pose configuration management and compliance challenges if not managed properly. Enterprise Manager features like inventory management, topology associations and configuration search can mitigate the sprawl of PDBs and also lock them to predefined golden standards using configuration comparison and compliance rules. Learn More ...

    Management of Long-Running Datacenter Processes - Change Activity Planner (CAP)
    Currently, customers resort to cumbersome methods to create, execute, track and monitor change activities within their data center. Some customers use traditional tools such as spreadsheets, project planners and in-house custom-built solutions. Customers often have weekly sync-up meetings across stakeholders to collect status and updates. Some of the change activities, for example the quarterly patch set update (PSU) patch rollouts, are not single tasks but processes with multiple tasks. Some of those tasks are performed within Enterprise Manager Cloud Control (for example, patching) and some are performed outside of Enterprise Manager Cloud Control. These tasks often run for a long period of time and involve multiple people or teams. Enterprise Manager Cloud Control supports core data center operations such as configuration management, compliance management, and automation. Enterprise Manager Cloud Control release 12.1.0.3 leverages these capabilities and introduces the Change Activity Planner (CAP). CAP provides the ability to plan, execute, and track change activities in real time. It covers the typical datacenter activities that are spread over a long period of time, across multiple people and multiple targets (even target types).
    Here are some examples of change activity processes in a datacenter:
    · Patching large environments (PSU/CPU patching cycles)
    · Upgrading a large number of database environments
    · Rolling out compliance rules
    · Database consolidation to Exadata environments

    CAP provides user flows for compliance officers/managers (incl. lead administrators) and operators (DBAs and admins). Managers can create change activity plans for various projects and allocate resources, targets, and the groups affected. Upon activation of the plan, tasks are created and automatically assigned to individual administrators based on target ownership. Administrators (DBAs) can identify their tasks and understand the context, schedules, and priorities. They can complete tasks using Enterprise Manager Cloud Control automation features such as patch plans (or, in some cases, outside Enterprise Manager). Upon completion, compliance is evaluated for validation, and the status of the tasks and the plans is updated. Learn More about CAP ...

    Improved Configuration & Compliance Management of a Large Number of Systems
    Improved configuration comparison: Get to the configuration comparison results faster for simple ad-hoc comparisons. When performing a 1-to-1 comparison, Enterprise Manager performs the comparison immediately and takes the user directly to the results, without having to wait for a job to be submitted and executed. Flattened system comparisons reduce comparison setup time and complexity. In addition to the previously existing topological comparison, users now have an option to compare using a “flattened” methodology. Flattening means removing duplicate target instances within the systems and removing the hierarchy of member targets. The result is that differences are much easier to spot, particularly for specific use cases like comparing patch levels between complex systems like RAC and Fusion Apps.

    Improved Configuration Search & Advanced EMCLI Script Option for Mass Automation
    Enterprise Manager 12c introduces a new framework-level capability to script and stitch together multiple tasks using EMCLI. This powerful capability can be leveraged for lifecycle operations, especially when executing a task over a large number of targets. Specific usages include retrieving a qualified list of targets using configuration search and then using the result set for automation, or executing a patching operation and then re-executing it on targets where it may have failed. This is complemented by other enhancements, such as better usability for designing reusable configuration searches. In EM 12c Release 3, a simplified UI makes building ad-hoc searches even easier. Searching for missing patches is a common use of configuration search; this required the use of the advanced options, which are now clearly defined and easy to use. You can also perform a configuration search using the EMCLI, which can be extremely useful for building sophisticated automation scripts. For example, run the search named “Oracle Databases on Exadata”, which finds all database targets running on top of Exadata, and further filter the results by options like name, host, etc.:
    emcli get_targets -config_search="Databases on Exadata" -target_name="exa%"
    Use this in powerful mass-automation operations with the new EMCLI script option, for example to find all databases running on Exadata that house E-Biz, and patch them. To do that, create a Python script with emcli functions and invoke it directly in the new EMCLI script option shell: $<path to emcli>/emcli @myPSU_Patch.py
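    As a rough illustration, a script driven this way might look like the sketch below. This is not the actual myPSU_Patch.py from the post: the credentials and the follow-up patching step are placeholders, and it assumes the EMCLI script option exposes each verb as a same-named Python function that returns a response object.

        # myPSU_Patch.py -- illustrative sketch only; run with: emcli @myPSU_Patch.py
        # Assumptions: EMCLI script mode exposes each verb as a Python function and
        # returns a response whose out() holds the verb output; the credentials,
        # search name and "patch" step below are placeholders.

        login(username="sysman")   # prompts for the password interactively

        # Reuse the saved configuration search to get a qualified list of targets,
        # narrowed with the same name filter shown above.
        resp = get_targets(config_search="Databases on Exadata", target_name="exa%")
        print(resp.out())          # the matching targets

        # Placeholder: iterate over the result set and feed each target into the
        # patching automation of your choice (for example, a patch plan), or
        # re-run only the targets where a previous patching operation failed.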
    Richer compliance content: There are now over 50 Oracle-provided compliance standards, including new standards for Pluggable Database, Fusion Applications, Oracle Identity Manager, Oracle VM and Internet Directory, plus 9 Oracle-provided real-time monitoring standards containing over 900 compliance rules across 500 facets. These new real-time compliance standards cover both Exadata compute nodes and Linux servers. The result is increased Oracle software coverage and faster time to compliance monitoring on Exadata.

    Enhancements to Patch Management
    Overhauled "OFFLINE" patching experience: The patch upload UI has been simplified to improve the offline patching experience, and there is now a single-step process to get patches into the software library. Customers often maintain local repositories of patches, sometimes called software depots, where they host the patches downloaded from My Oracle Support. In the past, you had to move these patches to your desktop and then upload them to Enterprise Manager's software library through the Enterprise Manager Cloud Control user interface. You can now use the following EMCLI command to upload multiple patches directly from a remote location within the data center:
    $emcli upload_patches -location <Path to Patch directory> -from_host <HOSTNAME>
    The upload process filters all of the new patches, automatically selects the relevant metadata files from the location, and uploads the patches to the software library.

    Other improvements: Patch rollback for single-instance databases: a new option in the patch plan rolls back the patches added to the plan. Upon execution, the procedure rolls back the patch and the SQL applied to the single-instance databases. Improved and faster configuration collection of Oracle Home targets enables more reliable automation of higher-level functions like provisioning, patching or Database as a Service.

    To recap, here is a list of database lifecycle management features (items new or enhanced in Release 3 are highlighted in red in the original post):
    • Discovery, inventory tracking and reporting
    • Database provisioning, including
    o Migration to pluggable databases
    o Plugging and unplugging of pluggable databases
    o Gold-image-based cloning
    o Scaling of RAC nodes
    • Schema and data change management
    • End-to-end patch management in online and offline modes, including
    o Patch advisories in online (connected with My Oracle Support) and offline mode
    o Patch pre-deployment analysis, deployment and rollback (currently only for single-instance databases)
    o Reporting
    • Upgrade planning and execution of the upgrade process
    • Configuration management
    • Compliance management with out-of-box content
    • Change Activity Planner for planning, designing and tracking long-running processes

    For more information on Enterprise Manager’s database lifecycle management capabilities, visit http://www.oracle.com/technetwork/oem/lifecycle-mgmt/index.html

    Read the article

  • Dynamic DNS with Comcast

    - by colithium
    I've just recently moved across town. Previously, I had dynamic DNS set up so I could remotely connect to my desktop (primarily to use TightVNC). My ISP was Comcast and I'm in the Denver, Colorado area. Currently, I'm still with Comcast and still in Denver. My router connects to the internet just fine, and my dynamic DNS record over at DynDNS did get updated with my router's current external IP address. So my router, DynDNS, and public DNS records all agree what my IP address is. However, I can't actually connect to anything from the outside world. My traceroute to Google looks something like:

    Tracing route to google.com [74.125.19.147]
    1   3 ms   1 ms   1 ms  192.168.1.1 (this is the internal IP address of my router)
    2   *      *      *     Request timed out.
    3   9 ms   8 ms  10 ms  te-8-2-ur02.wheatridge.co.denver.comcast.net [68.85.221.177]
    4  12 ms  12 ms  19 ms  te-0-8-0-2-ar02.aurora.co.denver.comcast.net [68.86.103.97]
    5  16 ms  13 ms  11 ms  pos-0-3-0-0-cr01.denver.co.ibone.comcast.net [68.86.91.1]
    6  28 ms  28 ms  27 ms  pos-0-9-0-0-cr01.dallas.tx.ibone.comcast.net [68.86.85.174]
    7  29 ms  27 ms  28 ms  pos-0-1-0-0-pe01.1950stemmons.tx.ibone.comcast.net [68.86.86.94]
    8  66 ms 108 ms   *     75.149.231.70
    9  65 ms  68 ms  93 ms  72.14.233.77
    10 67 ms  66 ms  66 ms  72.14.233.111
    11 67 ms  67 ms  69 ms  216.239.43.144
    12 68 ms  71 ms  73 ms  209.85.249.30
    13 66 ms  66 ms  68 ms  nuq04s01-in-f147.1e100.net [74.125.19.147]

    This is what the traceroute looks like from an outside source to my DynDNS domain name:

    traceroute to 98.245.67.65 (98.245.67.65)
    1  illuminati-130  138.67.130.61
    2  138.67.63.253  138.67.63.253
    3  vermiculite  138.67.253.20
    4  csm-ct-gw  138.67.253.244
    5  138.67.253.2  138.67.253.2
    6  ge-7-24-ar01.denver.co.denver.comcast.net  68.86.128.17
    7  te-0-4-0-0-ar02.denver.co.denver.comcast.net  68.86.179.21
    8  te-9-3-ur01.wheatridge.co.denver.comcast.net  68.86.103.18
    9  * * * {Times Out}

    Now my guess is that whatever is sitting just beyond my router (what the modem connects to) is gumming things up. Even though the routes aren't EXACTLY the same, that appears to be the spot where the traceroute either stops or doesn't get a response. My question is, for Comcast networks (particularly in Denver), what would be the device that typically sits there? Is there anything I can do about it? That device seems not to respond to PING but does forward it along when I'm going outwards. But it looks like it eats it when the request is coming in. It's hard to prove that from these logs, but I'm assuming that's the case because my router used to accept connections from the outside and I haven't changed anything on it.

    Read the article

  • Can't activate dynamic disk

    - by SuperFurryToad
    For some reason I can't activate a dynamic disk connected to my PC via a USB hard disk enclosure. It has activated previously without any problems. When I check the event logs, I get this message every time I try to activate the disk: dmio: Harddisk2 write error at block 2930277167: status 0xc0000015

    Read the article

  • Dynamic Disk Start on Boot

    - by Xiuhtecuhtli
    Is there a way to make a dynamic disk automatically come online after a reboot? At the moment I must manually bring the disk "online". I was thinking a startup script would do the trick. Does anyone know of a HOWTO or a premade script to do this?
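    A startup script can indeed do it. Below is a minimal sketch (an assumption on my part, not a tested recipe): a small Python wrapper around diskpart, scheduled to run at boot with Task Scheduler under an administrative account. The disk index is hypothetical; check yours first with diskpart's "list disk".

        # bring_disk_online.py -- hedged sketch: bring a disk online after boot.
        # Assumption: the affected disk shows up as "Disk 1" in diskpart; adjust
        # DISK_INDEX for your machine. Schedule with Task Scheduler ("At startup",
        # run with highest privileges).
        import subprocess
        import tempfile

        DISK_INDEX = 1  # hypothetical; confirm with: diskpart -> list disk

        DISKPART_COMMANDS = "\n".join([
            f"select disk {DISK_INDEX}",
            "online disk",                      # bring the offline disk back online
            "attributes disk clear readonly",   # in case it was left read-only
        ]) + "\n"

        def main() -> None:
            # diskpart reads its commands from a script file passed via /s
            with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
                f.write(DISKPART_COMMANDS)
                script_path = f.name
            result = subprocess.run(["diskpart", "/s", script_path],
                                    capture_output=True, text=True)
            print(result.stdout)
            result.check_returncode()

        if __name__ == "__main__":
            main()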

    Read the article

  • Using dynamic dns updater as Windows service

    - by Sami-L
    Currently I am using dynamic DNS updater software on Windows, but to keep it updating I have to leave it open; the software is not able to run in the background. I searched on Google, but the answer became complicated since there are many details to pay attention to: free or not free, installed on the machine or configured on the router, trusted or not trusted... So please help me find a trusted, free DNS updater for Windows that can run as a service, and ideally one that can also send an email when an update fails.
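    Failing a ready-made tool, one pragmatic route is a small script wrapped as a service (with NSSM, or a Task Scheduler job set to run at startup). A minimal sketch, assuming your provider exposes the common dyndns2-style HTTP update endpoint; the URL, hostname and credentials below are placeholders:

        # ddns_update.py -- hedged sketch of a background dynamic-DNS updater.
        # UPDATE_URL follows the common "dyndns2" update protocol; the hostname,
        # username and password are placeholders for your provider's values.
        # Wrap it with NSSM or a startup task to run it as a Windows service;
        # emailing on failure could be added with smtplib.
        import time
        import urllib.request

        UPDATE_URL = "https://members.dyndns.org/nic/update"  # provider-specific
        HOSTNAME = "myhost.example.com"                        # hypothetical
        USERNAME, PASSWORD = "user", "secret"                  # hypothetical
        INTERVAL_SECONDS = 300

        def current_ip() -> str:
            # A "what is my IP" service that returns the address as plain text.
            with urllib.request.urlopen("https://api.ipify.org") as resp:
                return resp.read().decode().strip()

        def push_update(ip: str) -> str:
            mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
            mgr.add_password(None, UPDATE_URL, USERNAME, PASSWORD)
            opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
            with opener.open(f"{UPDATE_URL}?hostname={HOSTNAME}&myip={ip}") as resp:
                return resp.read().decode()  # e.g. "good 1.2.3.4" or "nochg 1.2.3.4"

        def main() -> None:
            last_ip = None
            while True:
                ip = current_ip()
                if ip != last_ip:
                    print("updating", HOSTNAME, "->", ip, ":", push_update(ip))
                    last_ip = ip
                time.sleep(INTERVAL_SECONDS)

        if __name__ == "__main__":
            main()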

    Read the article

  • Recover data from a Windows Dynamic Volume Spanned Disk

    - by iCe
    I have a dynamic volume created with two spanned partitions over two disks. Recently, one disk has started failing, and I want to copy the data on that disk to another disk before replacing it. However, I don't know how to select only what is inside the failing disk, because the volume spans both disks. Maybe imaging the entire disk would do the job? Or do I have to copy all the data from both disks? Thanks in advance!

    Read the article

  • Chock-full of Identity Customers at Oracle OpenWorld

    - by Tanu Sood
    Oracle OpenWorld (OOW) 2012 kicks off this coming Sunday. Oracle OpenWorld is known to bring in Oracle customers, organizations big and small, from all over the world. And Identity Management is no exception. If you are looking to catch up with Oracle Identity Management customers, hear first-hand about their implementation experiences and discuss industry trends, business drivers, solutions and more at OOW, here are some sessions we recommend you attend:

    Monday, October 1, 2012

    CON9405: Trends in Identity Management
    10:45 a.m. – 11:45 a.m., Moscone West 3003
    Subject matter experts from Kaiser Permanente and SuperValu share the stage with Amit Jasuja, Senior Vice President, Oracle Identity Management and Security, to discuss how the latest advances in Identity Management are helping customers address emerging requirements for securely enabling cloud, social and mobile environments.

    CON9492: Simplifying your Identity Management Implementation
    3:15 p.m. – 4:15 p.m., Moscone West 3008
    Implementation experts from British Telecom, Kaiser Permanente and UPMC participate in a panel to discuss best practices, key strategies and lessons learned based on their own experiences. Attendees will hear first-hand what they can do to streamline and simplify their identity management implementation framework for a quick return on investment and maximum efficiency.

    CON9444: Modernized and Complete Access Management
    4:45 p.m. – 5:45 p.m., Moscone West 3008
    We have come a long way from the days of web single sign-on addressing the core business requirements. Today, as technology and business evolve, organizations are seeking new capabilities like federation, token services, fine-grained authorizations, web fraud prevention and strong authentication. This session will explore the emerging requirements for access management and what a complete solution is like, complemented with real-world customer case studies from ETS, Kaiser Permanente and TURKCELL, and product demonstrations.

    Tuesday, October 2, 2012

    CON9437: Mobile Access Management
    10:15 a.m. – 11:15 a.m., Moscone West 3022
    With more than 5 billion mobile devices on the planet and an increasing number of users using their own devices to access corporate data and applications, securely extending identity management to mobile devices has become a hot topic. This session will feature Identity Management evangelists from companies like Intuit, NetApp and Toyota to discuss how to extend your existing identity management infrastructure and policies to securely and seamlessly enable mobile user access.

    CON9491: Enhancing the End-User Experience with Oracle Identity Governance Applications
    11:45 a.m. – 12:45 p.m., Moscone West 3008
    As organizations seek to encourage more and more user self-service, business users are now primary end users for identity management installations. Join experts from Visa and Oracle as they explore how Oracle Identity Governance solutions deliver complete identity administration and governance solutions with support for emerging requirements like cloud identities and mobile devices.

    CON9447: Enabling Access for Hundreds of Millions of Users
    1:15 p.m. – 2:15 p.m., Moscone West 3008
    Dealing with scale problems? Looking to address identity management requirements with a million or so users in mind? Then take note of Cisco's implementation. Join this session to hear first-hand how Cisco tackled identity management and scaled its implementation to bolster security and enforce compliance.
    CON9465: Next Generation Directory – Oracle Unified Directory
    5:00 p.m. – 6:00 p.m., Moscone West 3008
    Get a 360-degree perspective from a solution provider, an implementation services partner and the customer in this session to learn how the latest Oracle Unified Directory solutions can help you build a directory infrastructure that is optimized to support cloud, mobile and social networking and yet delivers on scale and performance.

    Wednesday, October 3, 2012

    CON9494: Sun2Oracle: Identity Management Platform Transformation
    11:45 a.m. – 12:45 p.m., Moscone West 3008
    Sun customers are actively defining strategies for how they will modernize their identity deployments. Learn how customers like Avea and SuperValu are leveraging their Sun investment, evaluating areas of expansion/improvement and building momentum.

    CON9631: Entitlement-centric Access to SOA and Cloud Services
    11:45 a.m. – 12:45 p.m., Marriott Marquis, Salon 7
    How do you enforce that a junior trader can submit 10 trades/day, with a total value of $5M, if market volatility is low? How can you hide sensitive patient information from clerical workers but make it visible to specialists, as long as consent has been given or there is an emergency? How do you externalize such entitlements to allow dynamic changes without having to touch the application code? In this session, Uberether and HerbaLife take the stage with Oracle to demonstrate how you can enforce such entitlements on a service, not just within your intranet but also right at the perimeter.

    CON3957: Delivering Secure Wi-Fi on the Tube as an Olympics Legacy from London 2012
    11:45 a.m. – 12:45 p.m., Moscone West 3003
    In this session, Virgin Media, the U.K.'s first combined provider of broadband, TV, mobile, and home phone services, shares how it is providing free secure Wi-Fi services to the London Underground, using Oracle Virtual Directory and Oracle Entitlements Server, leveraging back-end legacy systems that were never designed to be externalized. As an Olympics 2012 legacy, the Oracle architecture will form a platform to be consumed by other Virgin Media services such as video on demand.

    CON9493: Identity Management and the Cloud
    1:15 p.m. – 2:15 p.m., Moscone West 3008
    Security is the number one barrier to cloud service adoption. Not so for industry-leading companies like SaskTel, ConAgra Foods and UPMC. This session will explore how these organizations are using Oracle Identity with cloud services and how some are offering identity management as a cloud service.

    CON9624: Real-Time External Authorization for Middleware, Applications, and Databases
    3:30 p.m. – 4:30 p.m., Moscone West 3008
    As organizations seek to grant access to broader and more diverse user populations, the importance of centrally defined and applied authorization policies becomes critical, both to identify who has access to what and to improve the end-user experience. This session will explore how customers are using attribute- and role-based access to achieve these goals.

    CON9625: Taking Control of WebCenter Security
    5:00 p.m. – 6:00 p.m., Moscone West 3008
    Many organizations are extending WebCenter in a business-to-business scenario requiring secure identification and authorization of business partners and their users. Leveraging LADWP's use case, this session will focus on how customers are leveraging, securing and providing access control to Oracle WebCenter portal and mobile solutions.
    Thursday, October 4, 2012

    CON9662: Securing Oracle Applications with the Oracle Enterprise Identity Management Platform
    2:15 p.m. – 3:15 p.m., Moscone West 3008
    Oracle Enterprise Identity Management solutions are designed to secure access to Oracle Applications and simplify compliance. Whether you are an EBS customer looking to upgrade from Oracle Single Sign-On or a Fusion Applications customer seeking to leverage the Identity instance as an enterprise security platform, this session with Qualcomm and Oracle will help you understand how to get the most out of your investment.

    And here's the complete listing of all the Identity Management sessions at Oracle OpenWorld.

    Read the article

  • SQL Like keyword in Dynamic Linq

    - by Erwin
    Hi fellow programmers, I want to use SQL's LIKE keyword in dynamic LINQ. The query that I want to make is like this:
    select * from table_a where column_a like '%search%'
    where column_a can be dynamically changed to another column, etc. In this dynamic LINQ:
    var result = db.table_a.Where( a => (a.column_a.Contains("search")) );
    the column can't be dynamically changed; only the search key can. How do we create a dynamic LINQ query like
    var result = db.table_a.Where("column_a == \"search\"");
    so that we can change both the column and the search key dynamically?

    Read the article

  • A function's static and dynamic parent

    - by legends2k
    I'm reading Thinking in C++ (vol. 2): "Whenever a function is called, information about that function is pushed onto the runtime stack in an activation record instance (ARI), also called a stack frame. A typical stack frame contains (1) the address of the calling function (so execution can return to it), (2) a pointer to the ARI of the function's static parent (the scope that lexically contains the called function, so variables global to the function can be accessed), and (3) a pointer to the function that called it (its dynamic parent). The path that logically results from repetitively following the dynamic parent links is the dynamic chain, or call chain." I'm unable to comprehend what the author means by a function's static and dynamic parent. I'm also not able to differentiate between items 1, 2 and 3; they all seem to be the same. Can someone please explain this passage to me?
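    For what it's worth, the three items are distinct: (1) is the return address (where execution resumes), (2) points at the frame of the lexically enclosing scope (the static parent, used to resolve non-local names), and (3) points at the frame of whichever function happened to make the call this time (the dynamic parent, the next link in the call chain). A small illustration of the static/dynamic distinction, sketched in Python rather than C++ since the idea is language-neutral:

        # static_vs_dynamic_parent.py -- illustration only, not from the book.
        import inspect

        def outer():
            x = "outer's local"            # lexically visible to inner()

            def inner():
                # Static (lexical) parent: outer() -- that is where 'x' is
                # resolved, no matter who calls inner().
                caller = inspect.stack()[1].function   # the dynamic parent
                print(f"x = {x!r}; dynamic parent (caller) = {caller}")

            return inner

        def some_other_function(fn):
            fn()   # at run time, some_other_function is inner()'s dynamic parent

        some_other_function(outer())
        # prints: x = "outer's local"; dynamic parent (caller) = some_other_function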

    Read the article

  • Getting results in a result set from dynamic SQL in Oracle

    - by msorens
    This question is similar to a couple of others I have found on Stack Overflow, but the differences are significant enough to me to warrant a new question, so here it is: I want to obtain a result set from dynamic SQL in Oracle and then display it as a result set in a SqlDeveloper-like tool, just as if I had executed the dynamic SQL statement directly. This is straightforward in SQL Server, so to be concrete, here is an example from SQL Server that returns a result set in SQL Server Management Studio or Query Explorer:
    EXEC sp_executesql N'select * from countries'
    Or more properly:
    DECLARE @stmt nvarchar(100)
    SET @stmt = N'select * from countries'
    EXEC sp_executesql @stmt
    The question "How to return a resultset / cursor from a Oracle PL/SQL anonymous block that executes Dynamic SQL?" addresses the first half of the problem: executing dynamic SQL into a cursor. The question "How to make Oracle procedure return result sets" provides a similar answer. Web search has revealed many variations of the same theme, all addressing just the first half of my question. I found this post explaining how to do it in SqlDeveloper, but that uses a bit of functionality of SqlDeveloper. I am actually using a custom query tool, so I need the solution to be self-contained in the SQL code. This custom query tool similarly does not have the capability to show output of print (dbms_output.put_line) statements; it only displays result sets. Here is yet one more possible avenue using 'execute immediate...bulk collect', but this example again renders the results with a loop of dbms_output.put_line statements. This link attempts to address the topic, but the question never quite got answered there either. Assuming this is possible, I will add one more condition: I would like to do this without having to define a function or procedure (due to limited DB permissions). That is, I would like to execute a self-contained PL/SQL block containing dynamic SQL and return a result set in SqlDeveloper or a similar tool. So to summarize: I want to execute an arbitrary SQL statement (hence dynamic SQL). The platform is Oracle. The solution must be a PL/SQL block with no procedures or functions. The output must be generated as a canonical result set; no print statements. The output must render as a result set in SqlDeveloper without using any SqlDeveloper-specific functionality. Any suggestions?

    Read the article

  • Dependency Injection with Massive ORM: dynamic trouble

    - by Sergi Papaseit
    I've started working on an MVC 3 project that needs data from an enormous existing database. My first idea was to go ahead and use EF 4.1 and create a bunch of POCOs to represent the tables I need, but I'm starting to think the mapping will get overly complicated as I only need some of the columns in some of the tables (thanks to Steven for the clarification in the comments). So I thought I'd give Massive ORM a try. I normally use a Unit of Work implementation so I can keep everything nicely decoupled and can use Dependency Injection. This is part of what I have for Massive:

    public interface ISession
    {
        DynamicModel CreateTable<T>() where T : DynamicModel, new();
        dynamic Single<T>(string where, params object[] args) where T : DynamicModel, new();
        dynamic Single<T>(object key, string columns = "*") where T : DynamicModel, new();
        // Some more methods supported by Massive here
    }

    And here's my implementation of the above interface:

    public class MassiveSession : ISession
    {
        public DynamicModel CreateTable<T>() where T : DynamicModel, new()
        {
            return new T();
        }

        public dynamic Single<T>(string where, params object[] args) where T : DynamicModel, new()
        {
            var table = CreateTable<T>();
            return table.Single(where, args);
        }

        public dynamic Single<T>(object key, string columns = "*") where T : DynamicModel, new()
        {
            var table = CreateTable<T>();
            return table.Single(key, columns);
        }
    }

    The problem comes with the First(), Last() and FindBy() methods. Massive is based around a dynamic object called DynamicModel and doesn't define any of the above methods; it handles them through a TryInvokeMember() override from DynamicObject instead:

    public override bool TryInvokeMember(InvokeMemberBinder binder, object[] args, out object result)
    {
    }

    I'm at a loss on how to "interface" those methods in my ISession. How could my ISession provide support for First(), Last() and FindBy()? Put another way, how can I use all of Massive's capabilities and still be able to decouple my classes from data access?

    Read the article

  • C# Dynamic Query Without A Database Model

    - by hitopp
    I have been searching the web for a solution to dynamic queries. I have found many different solutions (e.g. Linq to Sql, Dynamic Linq Expressions, Dynamic Query), but all of these solutions involve some sort of previous knowledge of the database (like a model in code). Maybe what I am asking is way off the deep end, but is there any possible way to dynamically query a database without a model? For example, a database has a Customers table with the following columns: CustomerID Name FavoriteColor I want to create a query as SELECT Name FROM Customers WHERE @0 = @1, where the two placeholders are populated dynamically. The resulting data does not tie to a model class and I would prefer to use some sort of framework to build the queries, not simple string concatenation. The System.Linq.Dynamic namespace came really close to fulfilling this request, but it uses a database model. I realize this is crazy, but I was just curious.
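    For what it's worth, the usual pattern here, regardless of framework, is to validate the identifiers against the database's own metadata and bind only the value as a parameter (identifiers can never be bound parameters). A sketch of that pattern, written in Python with sqlite3 purely to illustrate the idea; translating it to ADO.NET or a .NET micro-ORM is mechanical:

        # dynamic_query.py -- hedged sketch: query a table with a user-chosen column,
        # without a model class, by whitelisting identifiers and binding the value.
        import sqlite3

        def query_by_column(conn, table, column, value):
            cur = conn.cursor()
            # 1. Whitelist the identifiers against the live schema (identifiers
            #    cannot be bound as parameters, so validate them instead).
            tables = {row[0] for row in cur.execute(
                "SELECT name FROM sqlite_master WHERE type = 'table'")}
            if table not in tables:
                raise ValueError(f"unknown table: {table}")
            columns = {row[1] for row in cur.execute(f"PRAGMA table_info({table})")}
            if column not in columns:
                raise ValueError(f"unknown column: {column}")
            # 2. Bind the search value as a real parameter.
            cur.execute(f"SELECT Name FROM {table} WHERE {column} = ?", (value,))
            return cur.fetchall()

        if __name__ == "__main__":
            conn = sqlite3.connect(":memory:")
            conn.executescript("""
                CREATE TABLE Customers (CustomerID INTEGER, Name TEXT, FavoriteColor TEXT);
                INSERT INTO Customers VALUES (1, 'Ada', 'green'), (2, 'Bob', 'blue');
            """)
            print(query_by_column(conn, "Customers", "FavoriteColor", "blue"))  # [('Bob',)]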

    Read the article

  • New article available in "SOA Suite Essentials for WLI Users" series: Dynamic Data Lookup in a Business Process

    - by simone.geib
    It is my pleasure to announce the publishing of another article in our "SOA Suite Essentials for WLI Users" series: "Dynamic Data Lookup in a Business Process: Meta Data Cache Control in Oracle WebLogic Integration and Domain Value Maps in SOA Suite". This article explains how dynamic data can be retrieved in a business process using Domain Value Maps in SOA Suite and shows the similarities to the WLI XML MetaData Cache Control. Lots of customers have asked about this comparison and I hope they will find it useful. The article follows "Setting Web Service and JCA Adapter Endpoints Dynamically in Oracle SOA Suite" which describes how web services and JCA adapter endpoints in SOA Suite can be changed at run-time, and so completes the use case where a BPEL process writes to a file (via file adapter) and the output directory and the file name are set dynamically. Please let me know what you think about the series and this specific article.

    Read the article

  • Agent versus Agentless management

    - by Owen Allen
    I got a couple of questions about Agentless asset management: "What does agentless management do for an asset?" Agentless management is one of the two ways that you can manage an operating system. Rather than installing an Agent Controller on the OS, agentless management uses SSH to regularly check the system and gather monitoring data. Many of the actions that would be available on an agent-managed system are available on an agentless system, but actions such as running reports or updating an Oracle Solaris 10 or Linux OS are not available. A table showing the capabilities of agentless management is here. "What permissions does agentless management require?" Agentless management still requires root credentials. If you can't log into the system as root, you can provide one set of credentials for the login, and then a set of root credentials to switch to.

    Read the article

  • Product News: Oracle Unveils a Waste Management Solution for the Oracle E-Business Suite

    - by Evelyn Neumayr
    Oracle recently announced a new product to help organizations reduce the cost of compliance with international hazmat (short for hazardous materials), recycling, and environmental protection laws. This new waste management solution for Oracle E-Business Suite extends the capabilities of Oracle Depot Repair, Oracle Transportation Management and Oracle Global Trade Management. It automates and monitors waste management processes to help ensure that hazardous materials are tracked and handled in accordance with regulatory requirements. Oracle's waste management solution for the Oracle E-Business Suite leverages Oracle Transportation Management and Oracle Global Trade Management, enabling customers to view in-transit inventory across the extended supply chain, while also providing a single repository for all legal, regulatory and compliance-related information. Read here for more information.

    Read the article

  • How to mock a dynamic object

    - by Daniel Cazzulino
    Someone asked me how to mock a dynamic object with Moq, which might be non-obvious. Given the following interface definition:

    public interface IProject
    {
        string Name { get; }
        dynamic Data { get; }
    }

    When you try to set up the mock for the dynamic property values, you get an error (shown as a screenshot in the original post). What's important to realize is that a dynamic object is just a plain object whose properties happen to be resolved at runtime. Kinda like reflection, if you will: all public properties of whatever object happens to be the instance will be resolved just fine at runtime. Therefore, one way to mock this dynamic is to just create an anonymous type with the properties we want, and set the dynamic property to return that... Read the full article.

    Read the article

  • I've inherited 200K lines of spaghetti code -- what now?

    - by kmote
    I hope this isn't too general of a question; I could really use some seasoned advice. I am newly employed as the sole "SW Engineer" in a fairly small shop of scientists who have spent the last 10-20 years cobbling together a vast code base. (It was written in a virtually obsolete language: G2 -- think Pascal with graphics.) The program itself is a physical model of a complex chemical processing plant; the team that wrote it has incredibly deep domain knowledge but little or no formal training in programming fundamentals. They've recently learned some hard lessons about the consequences of non-existent configuration management. Their maintenance efforts are also greatly hampered by the vast accumulation of undocumented "sludge" in the code itself. I will spare you the "politics" of the situation (there's always politics!), but suffice to say, there is not a consensus of opinion about what is needed for the path ahead. They have asked me to begin presenting to the team some of the principles of modern software development. They want me to introduce some of the industry-standard practices and strategies regarding coding conventions, lifecycle management, high-level design patterns, and source control. Frankly, it's a fairly daunting task and I'm not sure where to begin. Initially, I'm inclined to tutor them in some of the central concepts of The Pragmatic Programmer, or Fowler's Refactoring ("Code Smells", etc.). I also hope to introduce a number of Agile methodologies. But ultimately, to be effective, I think I'm going to need to home in on 5-7 core fundamentals; in other words, what are the most important principles or practices that they can realistically start implementing that will give them the most "bang for the buck"? So that's my question: What would you include in your list of the most effective strategies to help straighten out the spaghetti (and prevent it in the future)?

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Alison Weiss

    Handle with Care
    Sound project management helps AmeriCares bring international aid to those in need.

    The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades. “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.” Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process. AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions.

    Efficiency Saves Lives
    One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan. When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely.
Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive. An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.” York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.” Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term. According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams collaborating together. The entire process can take months—from when products are received in the warehouse and deciding which recipients to allocate supplies to, to getting customs and governmental approvals in place, actually shipping products, and finally ensuring that the products are received in-country. Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.” Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott. 
    Setting a Standard
    Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase. D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.” Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was—because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario. With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office. AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears.

    A Cultural Change
    Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott.
    “Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.” AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.” The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country. Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.” With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors.

    Mining Historical Data
    Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates. The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year—to see if desired outcomes were achieved and if there are areas that need improvement.
It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.” Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.” And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

  • EC2 hosted service multi-tenant dynamic DNS solution

    - by accidental admin
    I want to change the model of my EC2-hosted service to have a separate subdomain for each tenant (i.e. <tenant>.example.com). My primary DNS is now with dnsmadeeasy.com, but their dynamic DNS offering seems pretty weak: it requires the API to use my full dnsmadeeasy.com account credentials (I'd rather have the API use a limited-privilege credential that can only add/remove/modify subdomain records), and from what I gather it only allows modifying existing records; it does not allow me to dynamically add/remove records for new tenant subdomains. My question: what are my alternatives? Is there something in the dnsmadeeasy API offering I misunderstood, and I should just use them? Is there some other similar DNS service that has a DDNS offering that satisfies my requirements? Or should I just bite the bullet and host my own DNS (my fear is not configuration/learning/know-how, my fear is reliability)? If you recommend the latter, can you detail the necessary steps or link to a good tutorial on how to do it?
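    One alternative worth mentioning (my suggestion, not something from the question) is to host the zone with a DNS provider that has a first-class record API and scoped credentials, for example Amazon Route 53, and create or update tenant records programmatically. A rough sketch with boto3; the hosted zone ID, domain and tenant values are placeholders:

        # route53_tenant_dns.py -- hedged sketch: manage per-tenant subdomains in Route 53.
        # HOSTED_ZONE_ID and DOMAIN are placeholders; the IAM credentials used can be
        # scoped to route53:ChangeResourceRecordSets on this one zone.
        import boto3

        HOSTED_ZONE_ID = "Z123EXAMPLE"   # hypothetical
        DOMAIN = "example.com"

        route53 = boto3.client("route53")

        def change_tenant_record(action, tenant, ip_address):
            """UPSERT or DELETE the A record <tenant>.example.com -> ip_address."""
            route53.change_resource_record_sets(
                HostedZoneId=HOSTED_ZONE_ID,
                ChangeBatch={
                    "Comment": f"dynamic record for tenant {tenant}",
                    "Changes": [{
                        "Action": action,   # "UPSERT" to add/update, "DELETE" to remove
                        "ResourceRecordSet": {
                            "Name": f"{tenant}.{DOMAIN}",
                            "Type": "A",
                            "TTL": 60,
                            "ResourceRecords": [{"Value": ip_address}],
                        },
                    }],
                },
            )

        if __name__ == "__main__":
            change_tenant_record("UPSERT", "acme", "203.0.113.10")   # hypothetical tenant/IP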

    Read the article

  • dynamic routing between openvpn tunnels

    - by pQd
    I'm thinking about using dynamic routing [OSPF or RIP] via OpenVPN tunnels. Right now I have a few offices connected in a full mesh, but this is not a scalable solution as we add more locations. I would like to avoid a situation where plenty of internal traffic is affected if one of the two VPN termination points that I plan to use is down. Do you have a similar configuration working in production? If so, what routing daemon did you use (quagga? something else?), and did you encounter any problems? Thanks!

    Read the article

  • Centralized Windows/Mac Patch Management that is easy to use

    - by BiggsTRC
    I'm looking for advice on what patch management solutions you would recommend based upon your experience. I'm also looking for which ones you would not recommend based upon your experience. We have a mixed network of Windows and Mac clients. Our central servers are all Windows servers, although I have considered putting in a Mac server to better handle our Mac clients. The issue we are facing currently is that we need to maintain the patches on all of our third-party applications. Right now we use WSUS, which handles patching of Windows and some Microsoft products but that is about it. I need something to cover the other applications, specifically things like Adobe products (Reader, Flash, Dreamweaver, etc.) Our network isn't that big (maybe 200 clients) and I don't have a person to dedicate just to patching and maintaining a patch management solution. Thus very large and complicated solutions like System Center are most likely out. I have recently been looking at Dell's Kace K1000 solution (http://www.kace.com/products/systems-management-appliance/). It seems simple and it provides a lot of tools in one package that I would like/need as well. I like the fact that it is self-contained in an appliance and that it is designed for solutions like mine. However, I'm not sure if this is the best solution. I've also looked some at Shavlik's Netchk solution (http://www.shavlik.com/netchk-protect.aspx) but I don't need an anti-virus product. However, it looks like they might have a very good patch database. My question is this: What are your thoughts on these two products? Are there better products out there? Are there issues that I'm not considering? I want something that is very good at patching a broad range of products, that is simple to use, that takes a minimal amount of management (like WSUS), and that (hopefully) works with Mac and Windows.

    Read the article

  • Godaddy hosting with dynamic ip?

    - by Razor Storm
    I bought a domain name on godaddy.com (let's call it www.website.com), but my server has a dynamic IP. How do I resolve this issue? What I've tried: I got a DNS account at dyndns.com (let's call it dns.dyndns.com), so I set up GoDaddy to forward the domain to dns.dyndns.com. Problem: when the user goes to www.website.com, they see dns.dyndns.com in the URL bar. Bad. So then I set up the forwarding on GoDaddy to forward with domain masking, but the problem now is that users can't even see URL paths, GET queries, or hash marks anymore! `www.website.com/folder/search?id=4#ajax=4` becomes just www.website.com in the URL bar. Bad! What do I do?

    Read the article

  • Dual boot Windows 7 with Windows 8- Dynamic Disk

    - by MeetM
    It's a long explanation. I have an HP Pavilion dm4 notebook. It came preinstalled with Windows 7 Home Premium. Recently, I tried to install the Windows 8 Developer Preview on my notebook, but while installing, it only allowed me to install it on my primary Windows 7 drive, i.e. drive C. I had kept one empty partition for Windows 8, but when I selected that option, the Next button at the bottom of the window was just greyed out, with an error saying Windows cannot boot from this drive... blah blah blah. So I googled and found another way of doing it with a VHD (virtual hard disk). This seemed to work, but on restarting it gave me a "VHD_BOOT_INITIALIZATION_FAILED" error. After trying all possible ways around 10 times, I gave up. I noticed that the only difference between the tutorials and my notebook is the disk type. They all had Basic and I have Dynamic. Is that the reason I'm not able to boot Windows 8? Any suggestions?

    Read the article
