Search Results

Search found 2667 results on 107 pages for 'peopletools strategy'.


  • How should I synchronize lists in WCF?

    - by mafutrct
    I've got a WCF service that offers multiple lists of items of different types. The lists can be changed on the server. Every change has to be published to all clients to make sure every client has an up-to-date copy of each server list. Currently I'm using this strategy: On login, every client receives the current status of every list. On every change, the added or removed item is sent to all clients. The drawback is that I have to create a callback for every list, since the items are of different types. Is there a pattern I could apply? Or do I really have to duplicate code for each of the lists?
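    The question is about WCF, but the underlying pattern is language-neutral. Below is a minimal sketch (shown here in Java) of one way to avoid a callback per list: publish a single generic "list changed" envelope that names the list, the kind of change, and carries the item as an opaque payload, so one callback contract serves every list. All names are hypothetical and nothing here is WCF-specific.

        import java.util.List;
        import java.util.concurrent.CopyOnWriteArrayList;

        enum ChangeKind { ADDED, REMOVED }

        final class ListChange {
            final String listName;   // which server list changed
            final ChangeKind kind;   // added or removed
            final Object item;       // the affected item (serialized in a real service)

            ListChange(String listName, ChangeKind kind, Object item) {
                this.listName = listName;
                this.kind = kind;
                this.item = item;
            }
        }

        interface ListChangeListener {
            void onChange(ListChange change);  // the single callback every client implements
        }

        final class ListPublisher {
            private final List<ListChangeListener> clients = new CopyOnWriteArrayList<>();

            void subscribe(ListChangeListener client) { clients.add(client); }

            // Called by the server whenever any list changes; one code path for all lists.
            void publish(String listName, ChangeKind kind, Object item) {
                ListChange change = new ListChange(listName, kind, item);
                for (ListChangeListener client : clients) {
                    client.onChange(change);
                }
            }
        }

    The trade-off is that clients receive the item as an untyped payload and dispatch on the list name, instead of getting one strongly typed callback per list.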

    Read the article

  • JavaScript: how to use data but hide it so that it cannot be reused

    - by loukote
    Hi all. I have some data that I'd like to publish on just one website, i.e. it should not be reused on other websites. The data is a set of numbers that changes every day; our journalists work hard to gather it. Is there any way to hide, encrypt, etc. the data so that it cannot be reused by others, but still show it in a graph at the same time? I found an ASCII-to-HEX tool that could be used for this (http://utenti.multimania.it/ascii2hex/). I wonder if you can suggest other ways. (Even if I have to completely change the strategy.) Many thanks!

    Read the article

  • LRU LinkedHashMap that limits size based on available memory

    - by sanity
    I want to create a LinkedHashMap which will limit its size based on available memory (i.e. when freeMemory + (maxMemory - allocatedMemory) gets below a certain threshold). This will be used as a form of cache, probably using "least recently used" as a caching strategy. My concern, though, is that allocatedMemory also includes (I assume) data that has not yet been garbage collected, and thus will over-estimate the amount of used memory. I'm concerned about the unintended consequences this might have. For example, the LinkedHashMap may keep deleting items because it thinks there isn't enough free memory, but the free memory doesn't increase because these deleted items aren't being garbage collected immediately. Does anyone have any experience with this type of thing? Is my concern warranted? If so, can anyone suggest a good approach? I should add that I also want to be able to "lock" the cache, basically saying "ok, from now on don't delete anything because of memory usage issues".
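    A minimal sketch of one way to wire this up, not a production cache: a LinkedHashMap in access order whose removeEldestEntry hook evicts while the estimated free memory is below a threshold, plus a simple "lock" switch to suspend eviction. The class name, the threshold, and the memory estimate are illustrative, taken from the wording of the question.

        import java.util.LinkedHashMap;
        import java.util.Map;

        public class MemoryBoundLruCache<K, V> extends LinkedHashMap<K, V> {
            private final long minFreeBytes;          // eviction threshold
            private volatile boolean evictionLocked;  // "don't delete anything" switch

            public MemoryBoundLruCache(long minFreeBytes) {
                super(16, 0.75f, true);                // true = access order (LRU)
                this.minFreeBytes = minFreeBytes;
            }

            public void setEvictionLocked(boolean locked) { this.evictionLocked = locked; }

            private static long estimatedFreeBytes() {
                Runtime rt = Runtime.getRuntime();
                // freeMemory + (maxMemory - totalMemory), as described in the question;
                // note this counts not-yet-collected garbage as "used".
                return rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
            }

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return !evictionLocked && estimatedFreeBytes() < minFreeBytes;
            }
        }

    Note that removeEldestEntry only evicts one entry per insertion, so under memory pressure the map shrinks gradually rather than all at once; that may or may not be the behaviour you want.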

    Read the article

  • Getting age from an encrypted DOB field

    - by Mailforbiz
    Hi all. Due to certain compliance requirements, we have to encrypt the user DOB field in the database. We also have another requirement: to be able to search for a user by age. Our DB doesn't support transparent encryption, so encryption will be handled by the application. Any good ideas on how to allow searching by age? One thought is to save the YOB (year of birth) in a separate column in cleartext and still comply with our compliance requirement. Aside from that, is there any other design strategy that would help? Thanks in advance!
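    A sketch of the "cleartext year-of-birth column" idea mentioned above: the full DOB stays encrypted, a birth_year column is stored in the clear, and an age search queries the year range first, then refines on the decrypted DOB if the exact age matters. The table and column names are hypothetical.

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.time.LocalDate;

        public class AgeSearch {
            // Users aged 'age' today were born in one of at most two calendar years,
            // so the cleartext year column narrows the candidates cheaply.
            public static ResultSet findCandidatesByAge(Connection conn, int age) throws SQLException {
                int thisYear = LocalDate.now().getYear();
                int yearLow = thisYear - age - 1;   // born late in this year and already had a birthday
                int yearHigh = thisYear - age;      // born early in this year, birthday already passed
                PreparedStatement ps = conn.prepareStatement(
                    "SELECT id, dob_encrypted FROM users WHERE birth_year BETWEEN ? AND ?");
                ps.setInt(1, yearLow);
                ps.setInt(2, yearHigh);
                return ps.executeQuery();           // caller decrypts dob_encrypted to confirm exact age
            }
        }

    Storing only the year leaks far less than the full DOB, which is usually what makes this acceptable to auditors, but that is a judgment call for your compliance team.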

    Read the article

  • JPA GeneratedValue with GenerationType.TABLE does a big jump after jvm restart

    - by joeduardo
    When I start my server and add an entry, the generated id starts with 1, 2, and so on. After a restart, adding an entry generates an id like 32,xxx. Another restart, and adding an entry generates an id like 65,xxx. I don't know why this is happening. Here's a snippet of the annotation I'm using for my id. I'm using Hibernate.

        @Id
        @GeneratedValue(strategy = GenerationType.TABLE)
        private Long id;
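    A hedged note: one common cause of jumps like this is that the table generator hands out ids in pre-allocated blocks, and whatever remains of the in-memory block is lost on restart. If that is what is happening here, naming a generator and setting allocationSize explicitly keeps the gaps small, at the cost of a generator-table round trip per insert when allocationSize = 1. The generator and table names below are illustrative, not taken from the question.

        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.GenerationType;
        import javax.persistence.Id;
        import javax.persistence.TableGenerator;

        @Entity
        public class MyEntity {

            @Id
            @TableGenerator(name = "my_entity_gen",
                            table = "id_generator",
                            pkColumnName = "gen_name",
                            valueColumnName = "gen_value",
                            allocationSize = 1)
            @GeneratedValue(strategy = GenerationType.TABLE, generator = "my_entity_gen")
            private Long id;
        }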

    Read the article

  • What non-standard behaviour does Gmail exhibit when it is used programmatically as a POP3 server?

    - by Mike Green
    I am trying to prepare a complete list of behaviour that Gmail's POP3 service exhibits that you wouldn't generally expect to find in a POP3 server. For example, Gmail appears to ignore the DELE (delete) command from a POP3 client; instead, it implements its own delete and archive strategy. The purpose of preparing the list is to stop developers from testing a POP3 client against the Gmail POP3 server and then assuming that all POP3 servers behave in the same way. Can anyone provide a more complete list of non-standard behaviour?

    Read the article

  • Parsing position files in Ruby

    - by john
    I have a sample position file like the one below.

        789754654 COLORA SOMETHING1 19370119FYY076 2342423234SS323423
        742784897 COLORB SOMETHING2 20060722FYY076 2342342342SDFSD3423

    I am interested in positions 54-61 (the 4th column). I want to change the date to a different format, so the final outcome will be:

        789754654 COLORA SOMETHING1 01191937FYY076 2342423234SS323423
        742784897 COLORB SOMETHING2 07222006FYY076 2342342342SDFSD3423

    The columns are separated by spaces, not tabs, and the final file should have exactly the same number of spaces as the original file; the only thing changing should be the date format. How can I do this? I wrote a script, but it loses the original spaces, so the positioning gets messed up.

        file.each_line do |line|
          dob = line.split(" ")
          puts dob[3] # got the date; change its format
          5.times { puts "**" }
        end

    Can anyone suggest a better strategy so that the positioning in the original file remains the same?

    Read the article

  • Imposing email limits on a web page

    - by Martin
    To deter spammers, what's a good strategy for imposing limits on users when they send email from our site? A count limit per day on individual IPs? Sender addresses? Domains? In general terms, but recommended figures would also be helpful. Our users can send emails through our web page. They can register and log in, but they are also allowed to do this without logging in, with a captcha and with a field for the sender's email. Admittedly, there is a header, "The user has sent you the following message.", which limits its usefulness to spammers, so perhaps it's not a big problem. Any comments on what I'm doing will be greatly appreciated.
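    A minimal sketch of the "count per key per day" idea (per IP, per sender address, or per domain, whichever key you decide to limit on). The limit value and the in-memory map are illustrative; a real site would likely persist the counters.

        import java.time.LocalDate;
        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;
        import java.util.concurrent.atomic.AtomicInteger;

        public class DailySendLimiter {
            private final int maxPerDay;
            private final Map<String, AtomicInteger> counts = new ConcurrentHashMap<>();
            private volatile LocalDate day = LocalDate.now();

            public DailySendLimiter(int maxPerDay) { this.maxPerDay = maxPerDay; }

            public synchronized boolean tryAcquire(String key) {
                LocalDate today = LocalDate.now();
                if (!today.equals(day)) {          // new day: reset all counters
                    counts.clear();
                    day = today;
                }
                int used = counts.computeIfAbsent(key, k -> new AtomicInteger()).incrementAndGet();
                return used <= maxPerDay;          // false = over the limit, reject the send
            }
        }

        // Hypothetical usage: call limiter.tryAcquire(clientIp) before sending the message,
        // and apply a lower limit for anonymous (captcha-only) senders than for logged-in users.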

    Read the article

  • How to instantiate objects of classes that have dependencies injected?

    - by chester89
    Let's say I have some class with a dependency injected:

        public class SomeBusinessCaller {
            ILogger logger;
            public SomeBusinessCaller(ILogger logger) {
                this.logger = logger;
            }
        }

    My question is, how do I instantiate an object of that class? Let's say I have an implementation of ILogger called AppLogger. After I say

        ObjectFactory.For<ILogger>().Use<AppLogger>();

    how do I call the constructor of SomeBusinessCaller? Am I calling

        SomeBusinessCaller caller = ObjectFactory.GetInstance<SomeBusinessCaller>();

    or is there a different strategy for that?

    Read the article

  • What GC parameters is a JVM running with?

    - by skaffman
    I'm still investigating issues I have with GC tuning (see my prior question), which involves lots of reading and experimentation. Sun Java 5+ JVMs attempt to automatically select the optimal GC strategy and parameters based on their environment, which is great, but I can't figure out how to query the running JVM to find out what those parameters are. Ideally, I'd like to see which values of the various GC-related -XX options are being used, as selected automatically by the VM. If I had that, I would have a baseline to begin tweaking from. Does anyone know how to recover these values from a running VM?
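    A sketch of what can be read back from inside a running JVM with the standard management API: the explicit flags passed on the command line and the names of the collectors actually selected. Flags the VM chose by ergonomics and left at their defaults will not appear in the input arguments; dumping those requires VM-specific tooling, which is not shown here.

        import java.lang.management.GarbageCollectorMXBean;
        import java.lang.management.ManagementFactory;

        public class GcSettingsDump {
            public static void main(String[] args) {
                // JVM arguments as passed on startup (includes any explicit -XX:... flags)
                for (String arg : ManagementFactory.getRuntimeMXBean().getInputArguments()) {
                    System.out.println("arg: " + arg);
                }
                // Which collectors the VM is actually running with
                for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                    System.out.println("collector: " + gc.getName());
                }
            }
        }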

    Read the article

  • What is the role of `while`-loops in computation expressions in F#?

    - by MizardX
    If you define a While method on the builder object, you can use while-loops in your computation expressions. The signature of the While method is:

        member b.While (predicate:unit->bool, body:M<'a>) : M<'a>

    For comparison, the signature of the For method is:

        member b.For (items:seq<'a>, body:unit->M<'a>) : M<'a>

    You should notice that, in the While method, the body is a plain M<'a> value, and not a function as in the For method. You can embed other statements, like let and function calls, inside your computation expressions, but those cannot possibly execute more than once in a while-loop.

        builder {
            while foo() do
                printfn "step"
                yield bar()
        }

    Why is the while-loop body not executed more than once, but merely repeated? Why the significant difference from for-loops? Better yet, is there some intended strategy for using while-loops in computation expressions?

    Read the article

  • jQuery/JavaScript: Trigger when preloaded

    - by user317563
    Hello there. jQuery has the .ready() function, but I am unsure whether this is what I need. I set a default background image in CSS (image 1 of 4). Once the document is loaded (images and all, not only the DOM), I want to start preloading background image 2 of 4. Once that is loaded, I want to fade image 1 into image 2. Then I want to preload the next image (3 of 4); once that is loaded, I want to fade from background image 2 into image 3, and finally preload image 4. Once image 4 is loaded, I would like to fade image 3 into image 4. After that, I want to cycle between the images with a set time interval. What strategy would I use to solve this? The .load() function? Thank you for your time. Kind regards, Marius

    Read the article

  • ExecutorService to read data from a database in chunks and run processing on them

    - by TazMan
    I'm trying to write a process that reads data from a database and uploads it onto a cloud datastore. How can I decide the partitioning strategy for the data? I want to query the table in chunks and process each chunk in 10 threads. Each thread will basically send its data to an individual node of a 10-node cluster in the cloud. Where in the multi-threading code below should the data query go, so that it extracts the data and sends 10 concurrent upload requests to the cloud?

        public class Caller {
            public static void main(String[] args) {
                ExecutorService executor = Executors.newFixedThreadPool(10);
                for (int i = 0; i < 10; i++) {
                    Runnable worker = new DomainCDCProcessor(i);
                    executor.execute(worker);
                }
                executor.shutdown();
                while (!executor.isTerminated()) {
                }
                System.out.println("Finished all threads");
            }
        }
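    A hedged sketch of where the chunking could live: each worker is handed its own chunk boundaries (here, an offset/limit over an ordered key) and opens its own connection, so ten workers issue ten concurrent reads and push their chunk toward their cloud node. The JDBC URL, table, credentials, and upload call are hypothetical placeholders, not part of the question's code.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        public class ChunkedUploader {
            static final String JDBC_URL = "jdbc:mysql://dbhost/mydb";   // placeholder
            static final int CHUNK_SIZE = 100_000;                       // rows per worker, placeholder

            public static void main(String[] args) {
                ExecutorService executor = Executors.newFixedThreadPool(10);
                for (int i = 0; i < 10; i++) {
                    final long offset = (long) i * CHUNK_SIZE;
                    executor.execute(() -> uploadChunk(offset, CHUNK_SIZE));
                }
                executor.shutdown();
            }

            static void uploadChunk(long offset, int limit) {
                String sql = "SELECT id, payload FROM source_table ORDER BY id LIMIT ? OFFSET ?";
                try (Connection conn = DriverManager.getConnection(JDBC_URL, "user", "pass");
                     PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setInt(1, limit);
                    ps.setLong(2, offset);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            sendToCloudNode(rs.getLong("id"), rs.getString("payload")); // placeholder
                        }
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }

            static void sendToCloudNode(long id, String payload) {
                // placeholder for the actual upload to one of the 10 cluster nodes
            }
        }

    Partitioning on an ordered key range (or a modulo of the key) usually scales better than OFFSET for large tables, but the shape is the same: the query goes inside each worker, parameterized by that worker's slice.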

    Read the article

  • What are the merits of using the various VCS (Version Control Systems) that exist to track Drupal projects?

    - by ZoFreX
    I'm trying to find the best version control strategy for my workflow with Drupal. I have multiple sites per install and I'm pretty sure I'll have a separate repository for each website. As far as I know, the VCSs worth considering are:

        SVN
        Bazaar (bzr)
        Git
        Mercurial (hg)

    I know how these compare to each other in general, but want to learn their merits/demerits for Drupal. If you're using (or have used) any of these for Drupal: What is your setup? What about the VCS you chose works well for managing Drupal projects? What doesn't?

    Read the article

  • Legalities of programming companion Android apps

    - by honeal
    I'm interested in creating companion apps for several current Android apps, and I was curious whether there is a legal issue with using their names and/or icons from the app. For example, the companion app might be called Angry Birds Companion or something, and use a picture of a level or one of the characters, etc. (I'm simply pulling this from thin air, so don't judge the idea, just the question, please.) I know there are strategy guides for video games that use icons and names, but I'm assuming they have prior consent. Does anyone have any factual input on this?

    Read the article

  • Perforce implementation of ClearCase-like "views"

    - by Pradyot
    I have read through the Perforce documentation and the "branching strategy" advice as well. One thing that's left me baffled is that a simple concern does not seem to be adequately addressed. When I am working on a project that touches many parts of our code base, I cannot check in my code at the end of the day without checking it into the trunk. So do I need to branch in this situation? I want to have a history of my changes on a long and hard project, so I can go back when I make a wrong turn. The problem I see with branching is that I will be creating copies of almost the entire codebase. Am I missing an obvious solution here? Thanks.

    Read the article

  • Gradient boosting predictions in low-latency production environments?

    - by lockedoff
    Can anyone recommend a strategy for making predictions using a gradient boosting model in the <10-15ms range (the faster the better)? I have been using R's gbm package, but the first prediction takes ~50ms (subsequent vectorized predictions average to 1ms, so there appears to be overhead, perhaps in the call to the C++ library). As a guideline, there will be ~10-50 inputs and ~50-500 trees. The task is classification and I need access to predicted probabilities. I know there are a lot of libraries out there, but I've had little luck finding information even on rough prediction times for them. The training will happen offline, so only predictions need to be fast -- also, predictions may come from a piece of code / library that is completely separate from whatever does the training (as long as there is a common format for representing the trees).
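    A sketch of the "train offline, export the trees, score in a thin runtime" idea the question hints at. The flat node layout and the logistic link for a binary classifier are assumptions for illustration; they are not gbm's actual format, which would need its own exporter. With ~50-500 shallow trees, a walk like this is typically well under a millisecond per prediction.

        public class BoostedTreeScorer {
            // Each tree is a flat array of nodes: {featureIndex, threshold, leftChild, rightChild, leafValue}.
            // featureIndex < 0 marks a leaf.
            private final double[][][] trees;
            private final double bias;

            public BoostedTreeScorer(double[][][] trees, double bias) {
                this.trees = trees;
                this.bias = bias;
            }

            public double predictProbability(double[] features) {
                double score = bias;
                for (double[][] tree : trees) {
                    int node = 0;
                    while (tree[node][0] >= 0) {                       // descend until a leaf
                        int feature = (int) tree[node][0];
                        node = features[feature] <= tree[node][1]
                                ? (int) tree[node][2] : (int) tree[node][3];
                    }
                    score += tree[node][4];                            // accumulate leaf value
                }
                return 1.0 / (1.0 + Math.exp(-score));                 // logistic link (assumption)
            }
        }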

    Read the article

  • What does "Only catch exceptions you can handle" really mean?

    - by KyleLib
    I'm tasked with writing an Exception Handling Strategy and Guidelines document for a .NET/C# project I'm working on, and I'm having a tough go at it. There's plenty of information available on how and when to throw, catch, and wrap exceptions, but I'm looking for a description of what sorts of things should go on inside the catch block, short of wrapping and rethrowing the exception.

        try
        {
            DoSomethingNotNice();
        }
        catch (ExceptionICanHandle ex)
        {
            // Looking for examples of what people are doing in catch blocks
            // other than throwing, or wrapping the exception and throwing.
        }

    Thanks in advance.

    Read the article

  • Using linked servers, OPENROWSET and OPENQUERY

    - by BuckWoody
    SQL Server has a few mechanisms to reach out to another server (even another server type) and query data from within a Transact-SQL statement. Among them are a set of stored credentials and information (called a Linked Server), a statement that uses a linked server called OPENQUERY, another called OPENROWSET, and one called OPENDATASOURCE. This post isn't about those particular functions or statements – hit the links for more if you're new to those topics. I'm actually more concerned about where I see these used than the particular method.

    In many cases, a Linked Server isn't another Relational Database Management System (RDBMS) like Oracle or DB2 (which is possible with a linked server), but another SQL Server. My concern is that linked servers are the new Data Transformation Services (DTS) from SQL Server 2000 – something that was designed for one purpose but which is being morphed into something much more. In the case of DTS, most of us turned that feature into a full-fledged job system. What was designed as a simple data import and export system has been pressed into service doing logic, routing and timing. And of course we all know how painful it was to move off of a complex DTS system onto SQL Server Integration Services.

    In the case of linked servers, what should be used as a method of running a simple query or two on another server, where you have an occasional connection or need a quick import of a small data set, is morphing into a full federation strategy. In some cases I've seen a complex web of linked servers, and when credentials, names or anything else changes there are huge problems.

    Now don't get me wrong – linked servers and other forms of distributed queries are a fantastic set of tools that we have to move data around. I'm just saying that when you start having lots of workarounds and when things get really complicated, you might want to step back a little and ask if there's a better way. Are you able to tolerate some latency? Perhaps you're able to use Service Broker. Would you like to be platform-independent on the data source? Perhaps a middle tier might make more sense, abstracting the queries there and sending them to the proper server. Designed properly, I've seen these systems scale further and be more resilient than loading up on linked servers.

    Read the article

  • Serious about Embedded: Java Embedded @ JavaOne 2012

    - by terrencebarr
    It bears repeating: more than ever, the Java platform is the best technology for many embedded use cases. Java's platform independence, high level of functionality, security, and developer productivity address the key pain points in building embedded solutions. Transitioning from 16 to 32 bit or even 64 bit? Need to support multiple architectures and operating systems with a single code base? Want to scale on multi-core systems? Require a proven security model? Dynamically deploy and manage software on your devices? Cut time to market by leveraging code, expertise, and tools from a large developer ecosystem? Looking for back-end services, integration, and management? The Java platform has got you covered.

    Java already powers around 10 billion devices worldwide, with traditional desktops and servers being only a small portion of that. And the 'Internet of Things' is just really starting to explode … it is estimated that within five years, intelligent and connected embedded devices will outnumber desktops and mobile phones combined, and will generate the majority of the traffic on the Internet. Is your platform and services strategy ready for the coming disruptions and opportunities?

    It should come as no surprise that Oracle is keenly focused on Java for Embedded. At JavaOne 2012 San Francisco the dedicated track for Java ME, Java Card, and Embedded keeps growing, with 52 sessions, tutorials, Hands-on-Labs, and BOFs scheduled for this track alone, plus keynotes, demos, booths, and a variety of other embedded content. To further prove Oracle's commitment, in 2012 for the first time there will be a dedicated sub-conference focused on the business aspects of embedded Java: Java Embedded @ JavaOne. This conference will run for two days in parallel to JavaOne in San Francisco, will have its own business-oriented track and content, and targets C-level executives, architects, business leaders, and decision makers.

    Registration and Call For Papers for Java Embedded @ JavaOne are now live. We expect a lot of interest in this new event and space is limited, so be sure to submit your paper and register soon. Hope to see you there! Cheers, – Terrence

    Filed under: Mobile & Embedded Tagged: ARM, Call for Papers, Embedded Java, Java Embedded, Java Embedded @ JavaOne, Java ME, Java SE Embedded, Java SE for Embedded, JavaOne San Francisco, PowerPC

    Read the article

  • The Importance of a Security Assessment - by Michael Terra, Oracle

    - by Darin Pendergraft
    Today's blog was written by Michael Terra, who was the Subject Matter Expert for the recently announced Oracle Online Security Assessment. You can take the online assessment here: Take the Online Assessment.

    Over the past decade, IT Security has become a recognized and respected business discipline. Several factors have contributed to IT Security becoming a core business and organizational enabler, including, but not limited to, increased external threats and increased regulatory pressure. Security is also viewed as a key enabler for strategic corporate activities such as mergers and acquisitions. Now, the challenge for senior security professionals is to develop an ongoing dialogue within their organizations about the importance of information security and how it can impact their organization's strategic objectives and mission.

    Conducting regular "Security Assessments" across the IT and physical infrastructure has become increasingly important. Security standards and frameworks, such as the international standard ISO 27001, are increasingly being adopted by organizations and their business partners as proof of their security posture, and "Security Assessments" are a great way to ensure continued alignment with these frameworks.

    Oracle offers a number of different security assessments covering a broad range of technologies. Some of these are short engagements conducted for free with our strategic customers and partners. Others are longer-term paid engagements delivered by Oracle Consulting Services or one of our partners. The goal of a security assessment (also known as a security audit or security review) is to ensure that necessary security controls are integrated into the design and implementation of a project, application or technology. A properly completed security assessment should provide documentation outlining any security gaps that exist in an infrastructure and the associated risks for those gaps. With that knowledge, an organization can choose to either mitigate, transfer, avoid or accept the risk.

    One example of an Oracle offering is the Security Readiness Assessment: a practical security architecture review focused on aligning an organization's enterprise security architecture to its business principles and strategic objectives. The service will establish a multi-phase security architecture roadmap focused on supporting new and existing business initiatives.

    Offering Overview. The Security Readiness Assessment will:

        Define an organization's current security posture and provide a roadmap to a desired future state architecture by mapping security solutions to business goals
        Incorporate commonly accepted security architecture concepts to streamline an organization's security vision from strategy to implementation
        Define the people, process and technology implications of the desired future state architecture

    The objective is to deliver cohesive, best-practice security architectures spanning multiple domains that are unique and specific to the context of your organization.

    Offering Details. The Oracle Security Readiness Assessment is a multi-stage process with a dedicated Oracle Security team supporting your organization. During the course of this free engagement, the team will focus on the following:

        Review your current business operating model and supporting IT security structures and processes
        Partner with your organization to establish a future state security architecture leveraging Oracle's reference architectures, capability maps, and best practices
        Provide guidance and recommendations on governance practices for the rollout and adoption of your future state security architecture
        Create an initial business case for the adoption of the future state security architecture

    If you are interested in finding out more, ask your Sales Consultant or Account Manager for details.

    Read the article

  • Learn Cloud Computing – It’s Time

    - by Ben Griswold
    Last week, I gave an in-house presentation on cloud computing. I walked through an overview of cloud computing – characteristics (on demand, elastic, fully managed by the provider), why we are interested (virtualization, distributed computing, increased access to high-speed internet, weak economy), various types (public, private, virtual private cloud) and service models (IaaS, PaaS, SaaS). Though numerous providers have emerged in the cloud computing space, the presentation focused on Amazon, Google and Microsoft offerings and provided an overview of their platforms, costs, data tier technologies, management and security. One of the biggest talking points was why developers should consider the cloud as part of their deployment strategy:

        You only have to pay for what you consume
        You will be well-positioned for one-time event provisioning
        You will reap the benefits of automated growth and scalable technologies

    For the record: having deployed dozens of applications on various platforms over the years, pricing tends to be the biggest customer concern. Yes, scalability is a customer consideration too, but it comes in a distant second. Boy do I hope you're still reading…

    You may be thinking, "Cloud computing is well and good and it sounds catchy, but should I bother? After all, it's just another technology bundle which I'm supposed to ramp up on because it's the latest thing, right?" Well, my clients used to be 100% reliant upon me to find adequate hosting for them. Now I find they are often aware of cloud services, and some come to me with the "possibility" that deploying to the cloud is the best solution for them. It's like the patient who walks into the doctor's office with their diagnosis and treatment already in mind thanks to the handful of Internet searches they performed earlier that day. You know what? The customer may be correct about the cloud. It may be a perfect fit for their app. But maybe not… I don't think there's a need to learn about every technical thing under the sun, but if you are responsible for identifying hosting solutions for your customers, it is time to get up to speed on cloud computing and the various offerings (if you haven't already). Here are a few references to get you going:

        DZone Refcardz #82 Getting Started with Cloud Computing by Daniel Rubio
        Wikipedia Cloud Computing – What is it?
        Amazon Machine Images (AMI)
        Google App Engine SDK
        Azure SDK
        EC2 Spot Pricing
        Google App Engine Team Blog
        Amazon EC2 Team Blog
        Microsoft Azure Team Blog
        Amazon EC2 – Cost Calculator
        Google App Engine – Cost and Billing Resources
        Microsoft Azure – Cost Calculator
        Larry Ellison has stated that cloud computing has been defined as "everything that we currently do" and that it will have no effect except to "change the wording on some of our ads"
        Oracle launches worldwide cloud-computing tour
        NoSQL Movement

    Read the article

  • Force.com presents Database.com SQL Azure/Amazon RDS unfazed

    - by Sarang
    At the DreamForce 2010 event in San Francisco, Force.com unveiled the next big thing in its Fat SaaS portfolio: "Database.com". I am still wondering how much they would have shelled out for that domain name. Now why would an already established SaaS player foray into a key building block like the database, potentially allowing enterprises to build apps that do not utilize the Force.com stack? One key reason is being seen as the Fat SaaS player with every trick in the SaaS space under its belt. You want CRM? Come hither. Want a custom development PaaS-like solution? Welcome home (VMForce). Want all your apps to talk to a cloud DB and minimize latency by having it reside closer to your cloud apps? You've come to the right place, sire! The other is potentially keeping smaller DB players, like Oracle (not surprisingly, the Database.com offering is a highly customized and scalable Oracle database), from entering the lucrative SaaS DB marketplace.

    The promised feature set looks great out of the box for someone who likes to visualize cool new architectures. The ground realities are certainly going to be a lot different, considering the SOAP/REST-style access patterns in lieu of the comfortable old shoe of SQL. Microsoft suffered heavily with its SDS (SQL Data Services) offering in early 2009 and had to pull the plug on the product, only to reintroduce it as a simple SQL Server in the cloud, SQL Windows Azure. Though MSFT is playing it cool by providing OData semantics to work with SQL Windows Azure, satisfying at least some needs of web-style access to a DB. The other features, like social data models including profiles, status updates, and feeds, seem interesting as well (although I believe social is just one of the aspects of large-scale collaborative computing).

    All these features start "free" for devs, which is good news, but the good news stops here. The overall pricing model of $ per user per transaction / month is highly disproportionate compared to Amazon RDS (based on MySQL) or SQL Windows Azure (based on MSSQL). Roger Jennigs of Oakleaf did an interesting comparison based on 3, 10, 100 and 500 users, and it turns out that Database.com, going by the current understanding, is way too expensive for the services on offer.

    The offering may not impact the decision for DotNet shops mulling their cloud strategy, or even some Java/MySQL shops thinking about Amazon RDS; however, for enterprises that have already invested in other Force.com offerings this could be a very important piece in the cloud strategy jigsaw, one which would address a key cloud DB issue, latency; at the very least it will help to have the DB in the neighborhood. The tooling and "SQL like" access provider drivers (think ODBC/JDBC) will be available later this year. Progress Software has already announced their JDBC driver stack for Database.com. It remains to be seen how effective the overall solution proves to be in the longer run, but for starters it is an important step towards consolidating Force.com's already strong positioning in the SaaS space. As always, contrasting views are welcome! :)

    Read the article

  • Microsoft Templates included in jQuery 1.5!

    - by Stephen Walther
    When I joined the ASP.NET team as the Program Manager for Ajax, the ASP.NET team was working on releasing a new version of the Microsoft Ajax Library. This new version of the Microsoft Ajax Library had several really innovative and unique features such as support for client templates, client data-binding, script dependency management, and globalization. However, we kept hearing the message that our customers wanted to use jQuery when building ASP.NET applications.

    Therefore, about ten months ago, we decided to pursue a risky strategy. Scott Guthrie sent me to Cambridge to meet with John Resig – the creator of jQuery and leader of the jQuery project – to find out whether Microsoft and jQuery could work together. We wanted to find out whether the jQuery project would be open to allowing Microsoft to contribute the innovative features that we were developing for the Microsoft Ajax Library -- such as client templates and client data-binding -- to the jQuery library.

    Fortunately, the Cambridge meeting with Resig went well. John Resig was very open to accepting contributions to the jQuery library. Over the next few months, we worked out a process for Microsoft to contribute new features to the open-source jQuery project. Resig and Guthrie appeared on stage at the MIX10 conference to announce that Microsoft would be contributing features to jQuery.

    It has been a long journey, but I am happy to report success. Today, Microsoft and the jQuery project have announced that three plugins developed by developers on the ASP.NET team – the jQuery Templates, jQuery Data Link, and jQuery Globalization plugins – have been accepted as official jQuery plugins. In addition, the jQuery Templates plugin will be integrated into jQuery 1.5, which is the next major release of jQuery.

    You can learn more about the plugins by watching the following Web Camps TV episode hosted by James Senior with Stephen Walther: Web Camps TV #5 - Microsoft Commits Code to jQuery! You can read Scott Guthrie's blog announcement here: http://weblogs.asp.net/scottgu/archive/2010/10/04/jquery-templates-data-link-and-globalization-accepted-as-official-jquery-plugins.aspx You can read the jQuery team's announcement here: http://blog.jquery.com/2010/10/04/new-official-jquery-plugins-provide-templating-data-linking-and-globalization/

    I wrote the original proposal for the jQuery Templates plugin. Dave Reed and Boris Moore were the ASP.NET developers responsible for actually writing the plugin (with lots of input from the jQuery team and the jQuery community). Boris has written a great set of tutorials on the Templates plugin. The first tutorial in his series is located here: http://www.borismoore.com/2010/09/introducing-jquery-templates-1-first.html

    I want to thank John Resig, Richard Worth, Scott Gonzalez, Rey Bango, Jorn Zaefferer, Karl Swedberg and all of the other members of the jQuery team for working with the ASP.NET team and accepting our contributions to the jQuery project.

    Read the article

  • SIM to OIM Migration: A How-to Guide to Avoid Costly Mistakes (SDG Corporation)

    - by Darin Pendergraft
    In the fall of 2012, Oracle launched a major upgrade to its IDM portfolio: the 11gR2 release. 11gR2 had four major focus areas:

        A more simplified and customizable user experience
        Support for cloud, mobile, and social applications
        Extreme scalability
        A clear upgrade path

    For SUN migration customers, it is critical to develop and execute a clearly defined plan prior to beginning this process. The plan should include initiation and discovery, assessment and analysis, future state architecture, review and collaboration, and gap analysis. To help you better understand your upgrade choices, SDG, an Oracle partner, has developed a series of three whitepapers focused on SUN Identity Manager (SIM) to Oracle Identity Manager (OIM) migration.

    In the second of this series on SUN Identity Manager (SIM) to Oracle Identity Manager (OIM) migration, Santosh Kumar Singh from SDG discusses the proper steps that should be taken during the planning-to-post-implementation phases to ensure a smooth transition from SIM to OIM. Read the whitepaper for Part 2: Download Part 2 from SDGC.com

    In the last of this series of white papers, Santosh will talk about Identity and Access Management best practices and how these need to be considered when going through an OIM migration. If you have not taken the opportunity, please read the first in this series, which discusses the Migration Approach, Methodology, and Tools for you to consider when planning a migration from SIM to OIM. Read the white paper for Part 1: Download Part 1 from SDGC.com

    About the author: Santosh Kumar Singh, Identity and Access Management (IAM) Practice Leader. Santosh, in his capacity as SDG Identity and Access Management (IAM) Practice Leader, has direct senior management responsibility for the firm's strategy, planning, competency building, and engagement delivery for this practice. He brings over 12 years of extensive IT, business, and project management and delivery experience, primarily within enterprise directory, single sign-on (SSO) application and federated identity services, provisioning solutions, role and password management, and security audit and enterprise blueprint. Santosh possesses strong architecture and implementation expertise in all areas within these technologies and has repeatedly led teams in successfully deploying complex technical solutions.

    About SDG: SDG Corporation empowers forward-thinking companies to strategize their future, realize their vision, and minimize their IT risk. SDG distinguishes itself by offering flexible business models to fit its clients' needs; faster time-to-market with its pre-built solutions and frameworks; a broad-based foundation of domain experts; and deep program management expertise. (www.sdgc.com)

    Read the article
