Search Results

Search found 21923 results on 877 pages for 'software companies'.

Page 219/877 | < Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >

  • Who is Jeremiah Owyang?

    - by Michael Hylton
    Q: What’s your current role and what career path brought you here?
    J.O.: I'm currently a partner and one of the founding team members at Altimeter Group. I'm the Research Director and also wear the hat of Industry Analyst. Prior to joining Altimeter, I was an Industry Analyst at Forrester covering Social Computing, and before that I deployed and managed the social media program at Hitachi Data Systems in Santa Clara. Around that time, I started a career blog called Web Strategy, which focused on how companies were using the web to connect with customers, and never looked back.
    Q: As an industry analyst, what are you focused on these days?
    J.O.: There are three trends I'm focusing my research on at this time: 1) The Dynamic Customer Journey: Individuals (both B2C and B2B) are given so many options in their sources of data, channels to choose from and screens to consume them on that we've found there are 75 potential permutations at each given touchpoint. Companies that can map this, then deliver information to individuals when they need it, will have a competitive advantage, and we want to find out who's doing this. 2) One of the sub-themes that supports this trend is Social Performance. Yesterday's social web was disparate engagement of humans, but the next phase will be data driven, and soon new technologies will emerge to help all those who are consuming, publishing, and engaging on the social web be more efficient with their time through forms of automation. As you might expect, this comes with upsides and downsides. 3) The Sentient World is our research theme that looks out the furthest, as the world around us (even inanimate objects) becomes 'self-aware' and able to talk back to us via digital devices and beyond. Big data, the internet of things, and mobile devices will all be part of this next set.
    Q: People cite that the line between work and life is getting more and more blurred. Do you see your personal life influencing your professional work?
    J.O.: The lines between our work and personal lives are dissolving, and this brings the upside of being always connected and having deeper relationships with those who are not physically near. It also means the downside of societal expectations that we're always around and available for colleagues, customers, and beyond. In the future, a balance will be sought as we seek to achieve the goals of family, friends, work, and our own personal desires. All of this is, ironically, being written at 4:30 am on a Sunday.
    Q: How can people keep up with what you’re working on?
    J.O.: A great question, thanks. There are a few sources of information; I'll lead with the first, which is my blog at web-strategist.com. A few times a week I'll publish my industry insights (hires, trends, forces, funding, M&A, business needs), as well as on Twitter, where I'll point to all the news that's fit to print: @jowyang. As my research reports go live (we publish them for all to read, called Open Research, at no cost) they'll emerge on my blog, or check out the research tab to find out more now: http://www.web-strategist.com/blog/research/
    Q: Recently, you’ve been working with us here at Oracle on something exciting coming up later this week. What’s on the horizon?
    J.O.: Absolutely! This coming Thursday, September 13th, I’m doing a webcast with Oracle on “Managing Social Relationships for the Enterprise”.
    This is going to be a great discussion with Reggie Bradford, Senior Vice President of Product Development at Oracle, and Christian Finn, Senior Director of Product Management for Oracle WebCenter. I’m looking forward to a great discussion around all those issues that so many companies are struggling with these days as they realize how much social media is impacting their business. It’s changing the way your customers and employees interact with your brand. Today it’s no longer a matter of when to become a social-enabled enterprise, but how to become a successful one.
    Q: You’ve been very actively pursued for media interviews and conference and company speaking engagements – anything you’d like to share to give us a sneak peek of what to expect on Thursday’s webcast?
    J.O.: Below is a 15-minute video which encapsulates Altimeter’s themes on the Dynamic Customer Journey and the Sentient World. I’m really proud to have taken an active role in the first ever LeWeb outside of Paris. This one, held in downtown London across the street from Westminster Abbey, was sold out. If you’ve not heard of LeWeb, it is a global Internet conference hosted by Loic and Geraldine Le Meur, a power couple who hail from Paris but are also living in Silicon Valley; it is one of my favorite conferences for connecting with brands, technology innovators, investors and friends. Altimeter was able to play a minor role in suggesting the theme for the event, “Faster Than Real Time”, which builds on previous LeWebs that focused on the “real-time web”. In this radical state, companies are able to anticipate the needs of their customers by using data, technology, and devices, and deliver meaningful experiences before customers even know they need them. I explore two of Altimeter’s three research themes, the Dynamic Customer Journey and the Sentient World, in my talk but, due to time, did not focus on the Adaptive Organization.

    Read the article

  • Ryan Weber On KCNext | #AJIReport

    - by Jeff Julian
    We sit down with Ryan Weber of KCNext in our office to talk about the Kansas City market for technology. The Technology Council of Greater Kansas City is committed to growing the existing base of technology firms, recruiting and attracting technology companies, aggregating and promoting our regional IT assets and providing peer interaction and industry news. During this show we talk about why KCNext is great for Kansas City. They offer some great networking and educational events, but also focus on connecting companies together to help build relationships on a business level. Make sure you visit their website to see what events are coming up and link up with them on Twitter to stay on top of news from the KC technology community. Listen to the Show Site: http://www.kcnext.com/ Twitter: @KCNext LinkedIn: KCNext - The Technology Council of Greater Kansas City

    Read the article

  • help needed for server hardware configuration

    - by sansknowledge
    Hi, basically I am a software guy who recently got promoted to a managerial cadre, which requires giving a recommendation for a server to run software developed by our company. The software is a workflow management system and the DB is Oracle 11; the approximate size of the daily transactions would be around 40 GB, and it should be connected to ~150 client machines (the number of client machines will be growing). Help in terms of CPU, processor, memory, rack and stack or RAID (a concept I really have yet to understand), and OS will be greatly appreciated.

    Read the article

  • Why do I need to set up Autologon values in the registry twice before it works, and can I fix this?

    - by jJack
    Background: As part of an automated testing suite I am building, I need to set up Autologon on my virtual machines 'on demand'. By on demand, I mean that I don't want to pre-configure my VM or any snapshot to have Autologon set up already, for security reasons and also a huge business case.
    My solution so far: I'm copying a script to the guest machine and then using Sysinternals PsExec to execute it. The script is:
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /f /v DefaultUserName /t REG_SZ /d myusername
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /f /v DefaultPassword /t REG_SZ /d myfakepassword
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /f /v DefaultDomainName /t REG_SZ /d mydomain
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /f /v ForceAutoLogon /t REG_SZ /d 1
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /f /v AutoAdminLogon /t REG_SZ /d 1
      reg add "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\AutoLogonChecked" /f /ve /d 1
    Note: I don't believe AutoLogonChecked is required for machines post Windows 2000, but I'm doing it just in case for now. Maybe ForceAutoLogon isn't required either; not sure yet.
    The Problem: I see PsExec executes this properly and all the values are in the registry, yet when I restart the machine the user isn't automatically logged on. When I run this a second time and then restart the machine, the user is finally logged on. A diff between the registry states shows that the first time I run this, it is missing both the "1" for AutoAdminLogon and also the DefaultPassword key. The second time I execute it, these values are intact as I intended. So, what is going on here? Is this expected? This post claims in the end that it really all just works (the problem there was that a logoff script was clearing the values). That doesn't seem to work for me, however. Note this seems unique to Windows 7; it does not occur in Windows XP. Also note that you don't need PsExec to recreate the issue - just modify the registry yourself.
    EDIT/update:
    - Log in interactively and run the script (so, not executing it remotely): logging off automatically logs me back in (so, it works).
    - Remotely execute the script in the guest while I'm interactively logged in: logging off automatically logs me back in (so, it works).
    - Remotely execute the script in the guest with a non-interactive session: if I log in afterwards (so, interactive now) and then back off, it logs me back in (so, it then works).
    EDIT/update 2: This only occurs for Win7 x86, Win7 x64 and Win8 x64. It does not occur for Windows XP.
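    One diagnostic worth trying (a hedged sketch, not part of the original post): immediately after the remote run, read the values back from both registry views and export the key, so you can tell whether the write landed in the 32-bit (WOW64) view or is being cleared somewhere before the reboot. The /reg:64 and /reg:32 switches are standard on Windows 7's reg.exe; everything else here is illustrative.

      rem Verify the values landed where Winlogon will read them (the 64-bit view on x64).
      reg query "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon /reg:64
      reg query "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultPassword /reg:64

      rem Compare with the 32-bit view in case the remote process was redirected.
      reg query "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon /reg:32

      rem Snapshot the whole key before and after the reboot to diff what disappears.
      reg export "hklm\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" C:\winlogon-before.reg /y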

    Read the article

  • layout analysis of text based pdf without ocr

    - by fastrack
    Before recognizing a PDF, OCR software does document layout analysis to determine which parts are text, tables or images, as shown in this example: http://cache.gawkerassets.com/assets/images/17/2011/07/papercrop.jpg. I want to use some parts of the text while leaving out the others, so having software that marks those zones comes in handy. Papercrop does a decent job, but it has a bug of not showing some of the text in the PDF file. OCR software can also do layout analysis, marking out "zones" which I can add or delete, but you have to OCR to do that. Since my PDFs are already text-based, I don't want to waste so much time OCRing. So my question is: is there any software that automatically marks out those zones and lets me manually manipulate them, without having to OCR? Thanks! Waiting for your help.
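    Since the PDFs already have a text layer, the zone geometry can usually be pulled straight out of the file without OCR. A minimal sketch with poppler-utils, assuming pdftotext is installed (the -bbox-layout flag exists in recent poppler releases; older builds may only offer -bbox):

      # Dump each text block with its bounding box as XHTML; no OCR involved.
      pdftotext -bbox-layout input.pdf layout.xhtml

      # Quick look at the detected blocks and their coordinates.
      grep "<block" layout.xhtml | head

    The resulting coordinates can then be edited by hand or fed to a cropping tool, which covers the "mark out zones" part even if it is not a point-and-click interface.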

    Read the article

  • Mission critical embedded language

    - by Moe
    Maybe the question sounds a bit strange, so I'll explain the background a little. Currently I'm working on a project at my university, which will be a complete on-board software for a satellite. The system is programmed in C++ on top of a real-time operating system. However, some subsystems like the attitude control system, the fault detection and a space simulation are currently only implemented in Matlab/Simulink, to prototype the algorithms efficiently. After their verification, they will be translated into C++. The complete on-board software grew very complex, and only a handful of people know the whole system. Furthermore, many of the students haven't programmed in C++ yet, and the manual memory management of C++ makes it even more difficult to write mission-critical software. Of course the main system has to be implemented in C++, but I asked myself whether it's maybe possible to use an embedded language to implement the subsystems which are currently written in Matlab. This embedded language should feature: static/strong typing and compiler checks to minimize runtime errors; small memory usage and relatively fast runtime; good numeric support, since attitude control algorithms are mainly numerical computations; and maybe some sort of functional programming features, as Matlab/Simulink encourages you to use them too. I googled a bit, but only found Lua. It looks nice, but I would not use it in mission-critical software. Have you ever encountered a situation like this, or do you know any language which could satisfy the conditions? EDIT: To clarify some things: embedded means it should be possible to embed the language into the existing C++ environment. So no compiled languages like Ada or Haskell ;)

    Read the article

  • What is meant by "Repeat Business"?

    - by vinoth
    Repeat Business obviously happens because the company has a great product or a great service. In the software industry, do companies make the code base complex enough so that the maintenance comes back to them? I have heard of cases where companies say "ya this code base has minor errors, let's ship them anyway, and let the customers come back for another change request on these". Then they would sometimes charge the customer for that. This question is specific to the software services industry. Do these things happen in the real world? I am trying to understand the business process.

    Read the article

  • Virtual Windows desktop

    - by Zack
    Is there any virtual desktop software that can virtualize a desktop the way a virtual machine or sandbox does (sort of like virtualization and sandboxing combined)? I want to create many desktops, and each of them must be sandboxed. For example, if I get a virus infection in one of the desktops, I just have to clear or close that desktop and everything is fine. Is there any software that can do the kind of job I have mentioned? Remember, I am not asking for a recommendation of software, but whether software exists that can do the job. UPDATE: I mean that every virtual desktop works as a virtual box or sandbox. Clearing or closing the desktop would work like shutting down that virtual box.

    Read the article

  • ISP doesn't allow incoming connections (i.e., hosting a server) - anyway to get around this using a VPN?

    - by Josh1billion
    My ISP, like many today, doesn't allow incoming connections, so if I try to host server software on my home PC, then anyone (even myself) trying to connect to that server software via my public IP address is not able to establish a connection. This becomes a problem because hosting online games is impossible. I do have a VPS Linux box rented; is there any VPN software I could install on it that I could connect to from my home PC, and then anyone connecting to a specific port on that VPS will just have the traffic tunneled to my home PC, allowing me to host games that way? If so, what software do I need (on both my PC and on the VPS), and how do I configure it?
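    One way to sketch this without a full VPN is plain SSH remote port forwarding from the home PC to the rented VPS. This assumes an OpenSSH server on the VPS; port 27015 is used purely as an example game port, and for the forwarded port to be reachable from the internet, sshd on the VPS needs GatewayPorts enabled.

      # On the VPS, in /etc/ssh/sshd_config (then restart sshd):
      #   GatewayPorts clientspecified

      # On the home PC: expose port 27015 on the VPS and tunnel it back to the local game server.
      ssh -N -R 0.0.0.0:27015:localhost:27015 user@your-vps.example.com

      # Players then connect to your-vps.example.com:27015.

    Note that SSH remote forwarding only carries TCP; for games that use UDP, an actual VPN on the VPS (OpenVPN or similar) would be needed instead.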

    Read the article

  • How to make a 100% UNIQUE dvd? [closed]

    - by sajawikio
    Is it possible to make a DVD which would be next to impossible to replicate exactly, even with special equipment? To be clear, not in the sense of making the data itself resistant to piracy - nothing like that. More as an analogy to antiques: a reproduction of a piece of furniture can be spotted by a trained eye as not being the original. Is it possible to do this for a DVD? Say a DVD is copied, even by someone using special equipment who is hypothetically dying to copy the DVD exactly for whatever reason - even such a copied DVD would somehow be detectable as not exactly the original (as if it were a reproduction of an original antique, but not the original), and it would be almost, or preferably entirely, impossible to ever copy the DVD 100% exactly again. Just some ideas below on the sort of thing I might go about doing to achieve this, but I am really not sure what programs, media, hardware, etc. would do the trick. For instance, do there exist any blank DVDs that already come pre-recorded with some sort of serial number or bar code, or other metadata, or an encrypted hash, or something like that? Maybe any blank DVD will do but I should get special software to extract hardcoded metadata? If so, which software? Or maybe even special hardware? Such a DVD whose "secret" can be: well, to know what the "secret" is and whether it is present on the disc, it probably should be readable by some software or maybe particular hardware (preferably only if some sort of key is known and input into the software; even better, only then can such secret data on the disc be read, otherwise nothing shows up and it looks like just a regular disc with no secret on it), and it would be impossible to actually replicate, especially not with regular burning hardware and preferably not at all. Other idea: is there any special software that can direct the laser's write head to physically "mar" the DVD in such a way that, when played in a DVD player, it makes a particular visual pattern or something like that; say the mark itself shows up as a faint scratch on the disc, but would be impossible for someone to reproduce exactly? EDIT: Also to clarify, suppose the DVD contains video and music that should be playable on DVD players, maybe a menu too (i.e. not a DVD containing software); and also to clarify, the question is about how to make the DVD 100% unique, not how to make the actual content of the DVD protected from "piracy".

    Read the article

  • Is Linear Tape File System (LTFS) Best For Transportable Storage?

    - by rickramsey
    Those of us in tape storage engineering take a lot of pride in what we do, but understand that tape is the right answer to a storage problem only some of the time. And, unfortunately for a storage medium with such a long history, it has built up a few preconceived notions that are no longer valid. When I hear customers debate whether to implement tape vs. disk, one of the common strikes against tape is its perceived lack of usability. If you could go back a few generations of corporate acquisitions, you would discover that StorageTek engineers recognized this problem and started developing a solution where a tape drive could look just like a memory stick to a user. The goal was to not have to care about where files were on the cartridge, but to simply see the list of files that were on the tape, and click on them to open them up. Eventually, our friends in tape over at IBM built upon our work at StorageTek and Sun Microsystems and released the Linear Tape File System (LTFS) feature for the current LTO5 generation of tape drives as an open specification.

    LTFS is really a wonderful feature and we’re proud to have taken part in its beginnings and, as you’ll soon read, its future. Today we offer LTFS-Open Edition, which is free for you to use in your Oracle Enterprise Linux 5.5 environment - not only on your LTO5 drives, but also on your Oracle StorageTek T10000C drives. You can download it free from Oracle and try it out.

    LTFS does exactly what its forefathers imagined. Now you can see immediately which files are on a cartridge. LTFS does this by splitting a cartridge into two partitions. The first holds all of the necessary metadata to create a directory structure for you to easily view the contents of the cartridge. The second partition holds all of the files themselves. When tape media is loaded onto a drive, a complete file system image is presented to the user. Adding files to a cartridge can be as simple as a drag-and-drop, just as you do today on your laptop when transferring files from your hard drive to a thumb drive, or with standard POSIX file operations.

    You may be thinking all of this sounds nice, but asking, “when will I actually use it?” As I mentioned at the beginning, tape is not the right solution all of the time. However, if you ever need to physically move data between locations, tape storage with LTFS should be your most cost-effective and reliable answer. I will give you a few examples of use cases where LTFS can be utilized. Media and Entertainment (M&E), Oil and Gas (O&G), and other industries have a strong need for their storage to be transportable. For example, an O&G company hunting for new oil deposits in remote locations takes very large underground seismic images which need to be shipped back to a central data center. M&E operations conduct similar activities when shooting video for productions. M&E companies also often transfer files to third parties for editing and other activities. These companies have three highly flawed options for transporting data: electronic transfer, disk storage transport, or tape storage transport. The first option, electronic transfer, is impractical because of the expense of the bandwidth required to transfer multi-terabyte files reliably and efficiently. If there’s one place that has bandwidth, it’s your local post office, so many companies revert to physically shipping storage media. Typically, M&E companies rely on transporting disk storage between sites even though it, too, is expensive.
    Tape storage should be the preferred format because, as IDC points out, “Tape is more suitable for physical transportation of large amounts of data as it is less vulnerable to mechanical damage during transportation compared with disk” (see note 1, below). However, tape storage has not been used in the past because of the restrictions created by proprietary formats. A tape may only be readable if both the sender and receiver have the same proprietary application used to write the file. In addition, the workflows may be slowed by the need to read the entire tape cartridge during recall. LTFS solves both of these problems, clearing the way for tape to become the standard platform for transferring large files. LTFS is open and, as long as you’ve downloaded the free reader from our website or that of anyone in the LTO consortium, you can read the data. So if a movie studio ships a scene to a third-party partner to add, for example, sound effects or a music score, it doesn’t have to care what technology the third party has. If it’s written back to an LTFS-formatted tape cartridge, it can be read.

    Some tape vendors like to claim LTFS is a “standard,” but beauty is in the eye of the beholder. It’s a specification at this point, not a standard. That said, we’re already seeing application vendors create functionality to write in an LTFS format based on the specification. And it’s my belief that both customers and the tape storage industry will see the most benefit if we all follow the same path. As such, we have volunteered to lead the way in making LTFS a standard, first with the Storage Networking Industry Association (SNIA), and eventually through to standards bodies such as the American National Standards Institute (ANSI). Expect to hear good news soon about our efforts.

    So, if storage transportability is one of your requirements, I recommend giving LTFS a look. It makes tape much more user-friendly and it’s free, which allows tape to maintain all of its cost advantages over disk!

    Note 1 - IDC Report, April 2011, “IDC’s Archival Storage Solutions Taxonomy, 2011”

    - Brian Zents
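    For readers who want to try what is described above, here is a minimal command-line sketch of the LTFS workflow using LTFS-Open Edition; the device name and mount point are illustrative only and will differ on your system.

      # Format the cartridge with the two LTFS partitions (index metadata + data).
      mkltfs -d /dev/st0

      # Mount the cartridge so it appears as an ordinary file system.
      mkdir -p /mnt/ltfs
      ltfs -o devname=/dev/st0 /mnt/ltfs

      # From here, standard POSIX operations apply: copy files on, list them, read them back.
      cp seismic_survey_001.dat /mnt/ltfs/
      ls -lh /mnt/ltfs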

    Read the article

  • Create Hidden Partition on USB

    - by Francesco
    I need to split a USB flash disk into two USB drives, each one with its own drive letter, but one of them has to be hidden. In the non-hidden partition I want to place my software, and in the hidden partition I need to place some files that are required by the software in order to work. Moreover, only the software may read, write, delete or execute the files in this partition. I thought of using a little partition viewed as a CD-ROM drive, as they do in many flash drives, but this solution does not allow writing other data at a later time, and the partition is visible to the user, who can read the files. Obviously the software must be able to access the partition and read, write, delete or execute its content. Is there a solution to do this, preferably one that also works on Linux?
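    A partial sketch of the partitioning half of this on Windows, using diskpart (sizes and labels are illustrative; this does not solve the "only my software may access it" requirement, and note that Windows versions before 10 typically mount only the first partition of media flagged as removable):

      rem Inside an elevated diskpart session:
      select disk 1
      clean
      create partition primary size=1024
      format fs=fat32 quick label=APP
      assign letter=E
      create partition primary
      format fs=fat32 quick label=DATA
      rem No "assign" for the second partition, so it gets no drive letter and stays
      rem out of Explorer; the application can still reach it via the volume GUID path
      rem (\\?\Volume{...}\) or by assigning a letter temporarily.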

    Read the article

  • MDM Poised for Growth

    - by david.butler(at)oracle.com
    David Nixon, an Oracle colleague of mine, was doing some research on MDM the other day. He came up with some well-founded insights that I thought I’d share with you. Gartner recently published a note asking “Should Organizations Using ERP 'Do' Master Data Management?” It may seem a bit strange, but that’s a question Gartner has been asked by a number of companies as organizations begin to understand the importance of data governance and data stewardship. That’s because ERP suites typically “focus on integrating their own applications within suites, but have little interest in making their suites interoperate with the applications or suites of other vendors.” Therefore, Gartner is advising customers that “have deployed or plan to support multiple packaged application suites (even from the same vendor) that have different semantic data and/or process models” to add an MDM solution.

    And it appears that customers are taking note. In a more recent note entitled “Search Analytics Trends: Master Data Management”, Gartner noted that MDM searches on gartner.com in November 2010 “were 300% higher than [in] May 2009, indicating the increased interest and importance that businesses are placing on MDM.” Why the increased interest? Moving towards a single version of the truth is a familiar theme, but customers are talking more about the underlying business value that this enables. For example, businesses are talking about the need to fix master data before they can successfully move forward on SOA initiatives. And the growing demands for compliance continue to be a major driver. In short, companies are talking more about specific and tangible business value, and they are looking for help creating business cases for an MDM initiative.

    Why This Matters

    Gartner’s notes make three things clear. First, MDM is poised for growth as organizations gain a greater understanding of it and of the need they have for it. Many are still sorting it out, but the demand is growing and is sure to rise. Second, any organization with a heterogeneous computing environment should invest in MDM. Even solutions from the same vendor may have different data models and could benefit from MDM. But the key to growth, or which vendors will benefit the most from it, is the third and perhaps most critical point: companies need help with the business case for MDM.

    Oracle can help your organization build a compelling business case for MDM. We have seen our 1100+ MDM customers gain competitive advantages in a wide variety of implementations. Give us a ring.

    Read the article

  • Log and Block Website using Windows 2008 SBS

    - by John
    A client has asked me to set up Windows 2008 SBS to block and log websites from a list they will provide. As far as I know they only have the standard edition, which means I cannot use ISA. I was thinking of using Squid authenticated against Active Directory. There is no budget for additional software. Does anyone know of a different/better solution using either open source software or software that is available in Windows 2008 SBS? Thanks
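    For the Squid route mentioned above, the block-and-log part is only a few lines of squid.conf; a rough sketch, with the file path and log location as examples (the Active Directory authentication piece would be configured separately, for instance with Squid's ntlm/negotiate auth helpers):

      # /etc/squid/squid.conf (fragment)
      # Domains to block, one per line in the file, e.g. ".example-blocked.com"
      acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
      http_access deny blocked_sites

      # Requests are logged here; this line just makes the location explicit.
      access_log /var/log/squid/access.log squid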

    Read the article

  • Oracle on Oracle: Is that all?

    - by Darin Pendergraft
    On October 17th, I posted a short blog and a podcast interview with Chirag Andani, talking about how Oracle IT uses its own IDM products. Blog link here. In response, I received a comment from reader Jaime Cardoso ([email protected]) who posted:
    “- You could have talked about how by deploying Oracle's Open standards base technology you were able to integrate any new system in your infrastructure in days.
    - You could have talked about how by deploying federation you were enabling the business side to keep all their options open in terms of companies to buy and sell while maintaining perfect employee and customer's single view.
    - You could have talked about how you are now able to cut response times to your audit and security teams into 1/10th of your former times
    Instead you spent 6 minutes talking about single sign on and self provisioning? If I didn't knew your IDM offer so well I would now be wondering what its differences from Microsoft's offer was. Sorry for not giving a positive comment here but, please your IDM suite is very good and, you simply aren't promoting it well enough”
    So I decided to send Jaime a note asking him about his experience, and to get his perspective on what makes the Oracle products great. What I found out is that Jaime is a very experienced IDM architect with several major projects under his belt.

    Darin Pendergraft: Can you tell me a bit about your experience? How long have you worked in IT, and what is your IDM experience?
    Jaime Cardoso: I started working in "serious" IT in 1998 when I became Netscape's technical specialist in Portugal. Netscape Portugal didn't exist, so I was working for their VAR here. Most of my work at the time was with Netscape's mail server and LDAP server. Since that time I've been bouncing between the systems side (Sun resellers, Solaris work, and I even worked with Sun's engineering in the making of a hierarchical storage product - Sun CIS, if you know it) and the applications side, mostly in LDAP and IDM. Over the years I've been doing support, service delivery and pre-sales / architecture design of IDM solutions for most big customers in Portugal. To name a few projects:
    - The first European deployment of Sun Access Manager (SAPO - Portugal Telecom)
    - The identity repository of 5/5 of the biggest Portuguese banks
    - The Portuguese government's federation of services project

    DP: OK, in your blog response, you mentioned 3 topics: 1. Using Oracle's standards-based architecture, (you) were able to integrate any new system in days: can you give an example? What systems, how long did it take, number of apps/users/accounts/roles, etc.?
    JC: It's relatively easy to design a user management strategy for a static environment, or if you simply assume that you're an <insert vendor here> shop and all your systems will bow to that vendor's will. We've all seen that path: the use of proprietary technologies in interoperability solutions. But then reality kicks in. As an ISP, I recall that I made the technical decision to use Active Directory as a central authentication system for the entire IT infrastructure. Clients, systems, apps, everything was there. As a good part of the systems and apps were running on UNIX, a connector became necessary in order to have the UNIX boxes authenticate against AD. And that strategy worked, but each new machine required the component to be installed, that component had to be monitored, and each new app had to be independently certified.
    A self-care user portal was an ongoing project; AD access assumes the client is inside the domain, something the ISP's customers (and UNIX boxes) weren't, nor had any intention of ever being. When the Windows 2008 rollout was done, Microsoft changed the Active Directory interface. The Windows administrators didn't have enough know-how about directories and the way systems outside the MS world behaved, so on the go-live things weren't properly tested and a general outage followed. Several hours and one rollback later, everything was back working. But the ISP still had to change all of its applications to work with the new access methods and reset the effort spent on the self-service user portal. To keep with the same strategy, they would also have to trust Microsoft not to change interfaces again.

    Simply by putting up an Oracle LDAP server in the middle and replicating the user info from AD into LDAP, most of the problems went away. Even systems for which no AD connector existed had PAM in them, so integration was done at the OS level, fully supported by the OS supplier. Sun Identity Manager already had a self-care portal combined with a user workflow, so all the clearances had to be given before an account was created or updated. Adding a new system as a client for these authentication services was simply a new checkbox in the OS installer, and even Tru64 systems were, for the first time, integrated with five minutes of work by a junior system admin. True, all the Windows clients and MS apps still went to AD for their authentication needs, so from the start everybody knew that they weren't 100% free of migration pains, but now they had a single point of problems to look at. If you're looking for numbers:
    - 500K directory entries (users)
    - 200-300 systems
    After the initial setup, I personally integrated about 20 systems / apps against LDAP in one day while being watched by the different IT teams. The internal IT staff did the rest.

    DP: 2. Using federation allows the business to keep options open for buying and selling companies, and yet maintain a single view for both employee and customer. What do you mean by this? Can you give an example?
    JC: The market is dynamic. The company that's being bought today may be sold again tomorrow. Companies that spread across different markets may see the regulator forcing a sale of part of the company for monopoly reasons, and companies that operate in multiple countries have to comply with different legislations. Our job as IT architects, while addressing the customers' and employees' authentication services, is quite hard and quite contradictory. On one hand, we need to give all of our employees access to the relevant systems, apps and resources, and we already have marketing talking with us, trying to find out who's a customer of the bought company but not of ours, so they can be addressed. On the other hand, we have to do that while keeping in mind that we may have to break up all that effort, and that different countries' legislation may become a problem with a full integration plan.

    That's a job for user federation. You don't want to be the one telling your President that he will sell that business unit without its customer database (making the deal worth a lot less), or that the buyer will take with him a copy of your entire customer database. Federation enables you to start controlling permissions for users outside of your traditional authentication realm. So what if the people of that company you just bought are keeping their old logins?
    Do you want, because of that, to have a dedicated system for their expense reports? And do you want to keep their sales (and pre-sales) people out of the loop in terms of your group's path? Control the information flow, establish a federation trust circle and give access to your apps to users that haven't (yet?) been brought into your internal login systems. You can still see your users in a unified view; you obviously control whether a user has access to any particular application, whether that user is in your local database or stored in a directory on the other side of the world.

    DP: 3. Cut response times of audit and security teams to 1/10. Is this a real number? Can you give an example?
    JC: No, I don't have any backing for this number. One of the companies I did system administration for has a SOX compliance policy in place (I remind you that I live in Portugal, so this definition of SOX may be somewhat different from what you're used to) and, every time the audit team says they'll do another audit, we have to negotiate the size of the sample with them and spend about 15 man-days gathering all the required info they ask for. I did some work with Sun's Identity Auditor and, from what I've been seeing, Oracle's product is even better; most of the information they ask for would have been provided in a few hours with the help of this tool. I do stand by what I said here but, to be honest, someone from the Identity Auditor team would do a much better job than me of explaining these time savings.

    Jaime is right: the Oracle IDM products have a lot of business value, and Oracle IT is using them for a lot more than I was able to cover in the short podcast that I posted. I want to thank Jaime for his comments and perspective. We want these blog posts to be informative and honest, so if you have feedback for the Oracle IDM team on any topic discussed here, please post your comments below.
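    To make the OS-level integration Jaime describes a bit more concrete, here is a small illustrative sketch of pointing a Linux box's PAM stack at a central LDAP directory; the hostname and base DN are made up, and a real deployment would add TLS certificates, NSS configuration and access controls.

      # /etc/ldap.conf (read by pam_ldap; values are examples only)
      uri ldaps://ldap.example.com
      base dc=example,dc=com
      ssl on

      # /etc/pam.d/common-auth (fragment): try LDAP first, fall back to local accounts.
      auth    sufficient    pam_ldap.so
      auth    required      pam_unix.so try_first_pass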

    Read the article

  • The king is dead, long live the king – Cloud Evening 15th Feb in London

    - by Eric Nelson
    Advert alert :-) The UK's only Cloud user group The Cloud is the hot topic. You can’t escape hearing about it everywhere you go. Cloud Evening is the UK’s only cloud-focussed user group. Cloud Evening replaces UKAzureNet, with a new objective to cover all aspects of Cloud Computing, across all platforms, technologies and providers. We want to create a community for developers and architects to come together, learn, share stories and share experiences. Each event we’ll bring you two speakers talking about what’s hot in the world of Cloud. Our first event was a great success and we're now having the second exciting instalment. We're covering running third party applications on Azure and federated identity management. We will, of course, keep you fed and watered with beer and pizza. Spaces are limited so please sign-up now! Agenda 6.00pm – Registration 6.30pm – Windows Azure and running third-party software - using Elevated Privileges, Full IIS or VM Roles  (by @MarkRendle): We all know how simple it is to run your own applications on Azure, but how about existing software? Using the RavenDB document database software as an example, Mark will look at three ways to get 3rd-party software running on Azure, including the use of Start-up Tasks, Full IIS support and VM Roles, with a discussion of the pros and cons of each approach. 7.30pm – Beer and Pizza. 8.00pm – Federated identity – integrating Active Directory with Azure-based apps and Office 365  (by Steve Plank): Steve will cover off how to write great applications which leverage your existing on-premises Active Directory, along with providing seamless access to Office 365. We hope you can join us for what looks set to be a great evening. Register now

    Read the article

  • Managing Social Relationships for the Enterprise – Part 1

    - by Michael Hylton
    Reggie Bradford, Senior Vice President, Oracle

    Today, Mark Hurd, President of Oracle, Thomas Kurian, Executive Vice President of Oracle, and I discussed the strategic importance of how social media is impacting the enterprise and how it is changing the way customers, prospects, employees and investors interact with brands worldwide. Oracle understands that the consumer is in control, and as such, brands must evolve and change to meet growing needs. In addition, according to social media thought leader and analyst from Altimeter Group, Jeremiah Owyang, companies now average 178 corporate-owned social media accounts. When Oracle added leading social marketing, listening analytics and development tools from Vitrue, Collective Intellect and Involver to its Cloud Services Suite, we went beyond providing a single set of tools. We developed an entire framework that includes a comprehensive social relationship management suite to help companies move beyond the social enterprise and achieve the social-enabled enterprise. The fundamental shift from transaction to engagement means that enterprises need not only a social strategy, but should also ensure that the information and data received from social initiatives flow back to marketing, sales, support and service. Doing so enables companies to deliver a proactive and compelling experience and provides analytics to turn engagement into opportunity, and ultimately that opportunity into revenue.

    On September 13, 2012, I am delighted to sit down with Jeremiah to further the discussion about how enterprises are addressing social media strategies and managing content. In addition, we will be taking your questions after the webinar via Twitter (@Oracle, @ReggieBradford, @cwfinn, @jowyang). Use #oracle and #socbiz to submit questions and follow the conversation. I look forward to speaking with you and answering your questions online.

    For more information about becoming a social-enabled enterprise, visit www.oracle.com/social. And don’t miss the insights of other social business thought leaders at www.oracle.com/goto/socialbusiness.

    Read the article

  • Estimating cost of labor for a controlled experiment

    - by Lorin Hochstein
    Let's say you are a software engineering researcher and you are designing a controlled experiment to compare two software technologies or techniques (e.g., TDD vs. non-TDD, Python vs. Go) with respect to some qualities of interest (e.g., quality of resulting code, programmer productivity). According to your study design, participants will work alone to implement a non-trivial software system. You estimate it should take about six months for a single programmer to complete the task. You also estimate via power analysis that you will need around sixty study participants to obtain statistically significant results, assuming the technologies actually do yield different outcomes. To maximize external validity, you want to use professional programmers as study participants. Unfortunately, it isn't possible to find professional programmers who can volunteer for several months to work full-time on implementing a software system. You decide to go the simplest route and contract with a large IT consulting firm to obtain access to programmers to participate in the study. What is a reasonable estimate of the cost range, per person-month, for the programming labor? Assume you are constrained to work with a U.S.-based firm, but it doesn't matter where in the U.S. the firm itself or the programmers are located. Note: I'm looking for a reasonable order-of-magnitude range suitable for back-of-the-envelope calculations so that when people say "Why doesn't somebody just do a study to measure X", I can say, "Because running that study properly would cost $Y", and have a reasonable argument for the value of $Y.
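    As an illustration of the kind of back-of-the-envelope figure being asked for (the per-person-month rate here is an assumed placeholder, not a quoted market rate):

      60 participants x 6 person-months each = 360 person-months
      360 person-months x $15,000 per person-month (assumed) ≈ $5.4M in programming labor alone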

    Read the article

  • Upgrade Windows 7 to Windows 8 using Technet?

    - by WillyWonka
    I want to get a TechNet subscription to test some Windows software before I buy it. I want to replicate upgrading Windows 7 to Windows 8 with specific software in a virtual machine, then see how stable it is, or whether it is possible to do at all. I looked at the list of software but they only show Windows 8 Pro or Enterprise. Do you know if there is a Windows 7 to 8 upgrade ISO available for TechNet Standard or Pro?

    Read the article

  • #altnetseattle – Collaboration, Why is it so hard!

    - by GeekAgilistMercenary
    The session convened and we began a discussion about why collaboration is so hard. To work together better in software, we engineers have to overcome traditional software approaches (silos of work) and the human tendency to go off in a corner to work through an issue. It was agreed upon that software engineers are jack asses of jack assery. Break down the stoic and silent types by presenting continuous enthusiasm until they open up to the group. Know that it is ok to ask the dumb question or to work through basic things once in a while. Non-work interactions are pivotal to work-related collaboration. Collaboration is mostly autonomous of process (i.e. Agile or Waterfall). Latency in the software development feedback loop should be minimal. Collaboration is enhanced by Agile ideals and by things like the Scrum or Lean processes. Agile is not a process; Lean and Scrum are processes. Agile is an ideal. Lean, Agile, Scrum, Waterfall, Six Sigma, CMMI, oh dear. . . Great session. Off to the next session and more brain crunching. . . weeeeeeee!

    Read the article

  • Can't complete dropbox installation from behind proxy

    - by Mark Jones
    Problem: My PC on campus sits behind a proxy (requiring authentication) and I can't set up Dropbox. I am convinced that this is a proxy issue as I can't set up Ubuntu One either (but I don't use Ubuntu One, so that is not a problem). I have looked at the Ubuntu One fix but it seems to just modify settings explicitly related to Ubuntu One. I can install the nautilus-dropbox package (compiled from source, from the .deb package on the website, and from the software centre), but once I click OK on the "Dropbox Installation" dialog box (prompting me to download the proprietary daemon) the installation just freezes with the OK button pressed. When I look at its process in System Monitor, its waiting channel is inet_wait_for_connect. I have set the following proxy directives thus far (where ** is my password):
    - Added mj22:**@proxy.waikato.ac.nz:80 information to the network proxy settings under Network in Settings.
    - Added http_host and http_port variables under gconf-editor > system > proxy.
    - Added 'host', 'authentication_password', 'authentication_user' and ticked 'use authentication' and 'use_http_proxy' under gconf-editor > system > http_proxy.
    - Added export http_proxy="http://mj22:**@proxy.waikato.ac.nz:80/" to /etc/bash.bashrc
    - Added Acquire::http::proxy "http://mj22:**@proxy.waikato.ac.nz:80/"; to /etc/apt/apt.conf (which is what I imagine is letting Software Center retrieve packages).
    I have also added the equivalent ftp and https lines for the above entries. I get the internet fine and the Software Centre can download packages, but that's it.
    Related issues: The Software Centre can't fetch reviews (but can download packages). When trying to add an online account in GNOME 3, a dialog pops up with "Error getting a Request Token: Cannot connect to proxy (proxy.waikato.ac.nz)".
    Updates: After some time (10 mins or so) Dropbox shows an error dialog box that reads: Trouble connecting to Dropbox servers. Maybe your internet connection is down, or you need to set your http_proxy environment variable. Is there a way I can see what environment variables are currently set?
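    To answer the last question above: the current environment can be inspected from a terminal, and note that variables exported in /etc/bash.bashrc are only seen by shells, not by GUI applications launched from the desktop session. A quick check (a sketch; the proxy URL is the one from the question):

      # Show every proxy-related variable the current session actually has.
      env | grep -i proxy

      # Set it for this shell (and anything started from it) to test Dropbox by hand:
      export http_proxy="http://mj22:**@proxy.waikato.ac.nz:80/"
      export https_proxy="$http_proxy"

      # For a system-wide setting that desktop sessions also pick up, the variables
      # can go in /etc/environment instead of bash.bashrc.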

    Read the article
