Search Results

Search found 17050 results on 682 pages for 'community management'.


  • links for 2010-12-23

    - by Bob Rhubart
    Oracle VM Virtualbox 4.0 extension packs (Wim Coekaerts Blog): Wim Coekaerts describes the new extension pack in Oracle VM Virtualbox 4.0 and how it's different from 3.2 and earlier releases. (tags: oracle otn virtualization virtualbox)

    Oracle Fusion Middleware Security: Creating OES SM instances on 64 bit systems: "I've already opened a bug on this against OES 10gR3 CP5, but in case anyone else runs into it before it gets fixed I wanted to blog it too. (NOTE: CP5 is when official support was introduced for running OES on a 64 bit system with a 64 bit JVM)" - Chris Johnson (tags: oracle otn fusionmiddleware security)

    Oracle Enterprise Manager Grid Control: Shared loader directory, RAC and WebLogic Clustering: "RAC is optional. Even the load balancer is optional. The feed from the agents also goes to the load balancer on a different port and it is routed to the available management server. In the normal case, this is ok." - Porus Homi Havewala (tags: WebLogic oracle otn grid clustering)

    Magic Web Doctor: Thought Process on Upgrading WebLogic Server to 11g: "Upgrading to new versions can be a challenging task, but it's done for linear scalability, continuous enhanced availability, efficient manageability and automatic/dynamic infrastructure provisioning at a low cost." - Chintan Patel (tags: oracle otn weblogic upgrading)

    InfoQ: Using a Service Bus to Connect the Supply Chain: Peter Paul van de Beek presents a case study of using a service bus in a supply channel connecting a wholesale supplier with hundreds of retailers: the overall context and challenges faced (including the integration of POS software coming from different software providers), the solution chosen and its implementation, how it worked out, and the lessons learned along the way. (tags: ping.fm)

    Oracle VM VirtualBox 4.0 is released! (The Fat Bloke Sings): The Fat Bloke spreads the news and shares some screenshots. (tags: oracle otn virtualization virtualbox)

    Leaks on Wikis: "Corporations...You're Next!" Oracle Desktop Virtualization Can Help. (Oracle's Virtualization Blog): "So what can you do to guard against these types of breaches where there is no outsider (or even insider) intrusion to detect per se, but rather someone with malicious intent is physically walking out the door with data that they are otherwise allowed to access in their daily work?" - Adam Hawley (tags: oracle otn virtualization security)

    OTN ArchBeat Podcast Guest Roster: As the OTN ArchBeat Podcast enters its third year, it's time to acknowledge the invaluable contributions of the guests who have participated in ArchBeat programs. Check out this who's who of ArchBeat podcast panelists, with links to their respective interviews and more. (tags: oracle otn oracleace podcast archbeat)

    Show Notes: Architects in the Cloud (ArchBeat): Now available! Part 2 (of 4) of the ArchBeat interview with Stephen G. Bennett and Archie Reed, the authors of "Silver Clouds, Dark Linings: A Concise Guide to Cloud Computing." (tags: oracle otn podcast cloud)

    A Cautionary Tale About Multi-Source JNDI Configuration (Scott Nelson's Portal Productivity Ponderings): "I ran into this issue after reading that p13nDataSource and cgDataSource-NonXA should not be configured as multi-source. There were some issues changing them to use the basic JDBC connection string and when rolling back to the bad configuration the server went 'Boom.'" - Scott Nelson (tags: weblogic jdbc oracle jndi)

    Read the article

  • Algorithmia Source Code released on CodePlex

    - by FransBouma
    Following the release of our BCL Extensions Library on CodePlex, we have now released the source code of Algorithmia on CodePlex! Algorithmia is an algorithm and data-structures library for .NET 3.5 or higher and is one of the pillars LLBLGen Pro v3's designer is built on. The library contains many data structures and algorithms, and the source code is well documented and commented, often with links to official descriptions and papers of the algorithms and data structures implemented. The source code is shared using Mercurial on CodePlex and is licensed under the friendly BSD2 license. User documentation is not available at the moment but will be added soon.

    One of the main design goals of Algorithmia was to create a library containing implementations of well-known algorithms that weren't already implemented in .NET itself. This way, more developers can enjoy the results of many years of Computer Science research. Some algorithms and data structures that do exist in .NET are re-implemented because the .NET implementation isn't efficient for many situations or lacks features. An example is the linked list in .NET: it doesn't have an O(1) concat operation, because every node refers to the containing LinkedList object it's stored in. This is bad for algorithms which rely on O(1) concat operations, like the Fibonacci heap implementation in Algorithmia. Algorithmia therefore contains a linked list with an O(1) concat feature.

    The following functionality is available in Algorithmia:

    - Command, Command management. This system lets you build a fully undo/redo-aware application by building your object graph using command-aware classes. The Command pattern is implemented using a system which allows transparent undo/redo and command grouping, so you can make a class undo/redo aware and still set its properties and use its contents without dealing with commands at all. The Commands namespace is the place to start. Classes you'd want to look at are CommandifiedMember, CommandifiedList and KeyedCommandifiedList. See the CommandQueueTests in the test project for examples.

    - Graphs, Graph algorithms. Algorithmia contains a sophisticated graph class hierarchy and algorithms implemented on top of it: non-directed and directed graphs, as well as a subgraph view class, which can be used to create a self-maintaining view onto an existing graph. Algorithms include transitive closure, topological sorting and others. A feature-rich depth-first search (DFS) crawler is available so DFS-based algorithms can be implemented quickly. All graph classes are undo/redo aware, as they can be set to be 'commandified'. When a graph is 'commandified' it does its housekeeping through commands, which makes it fully undo/redo aware: you can remove, add and manipulate the graph and undo/redo the activity automatically without any extra code. If you define the properties of the vertex type using CommandifiedMember, you can manipulate the properties of vertices and the graph contents with full undo/redo functionality without any extra code.

    - Heaps. Heaps are data structures which always have the largest or smallest item stored in them as the 'root'. Extracting the root makes the heap determine the next in line to be the 'maximum' or 'minimum' (max-heap vs. min-heap; all heaps in Algorithmia can do both). Algorithmia contains various heaps, among them an implementation of the Fibonacci heap, one of the most efficient heap data structures known today, especially when you want to merge different instances into one.

    - Priority queues. Priority queues are specializations of heaps. Algorithmia contains a couple of them.

    - Sorting. What's an algorithm library without sort algorithms? Algorithmia implements a couple of sort algorithms which sort the data in-place. This is important in situations where you want to sort the elements in a buffer/list/ICollection in-place, so all data stays in the data structure it's already stored in.

    - PropertyBag. This re-implements Tony Allowatt's original idea in .NET 3.5-specific syntax: a generic property bag which lets you build an object in code at runtime that can be bound to a property grid for editing. This is handy when you have data or settings stored in XML or another format and want to create an editable form of them without creating many editors.

    - IEditableObject/IDataErrorInfo implementations. Algorithmia contains default implementations for IEditableObject and IDataErrorInfo (EditableObjectDataContainer for IEditableObject and ErrorContainer for IDataErrorInfo), which make it very easy to implement these interfaces (just a few lines of code) without having to worry about bookkeeping during databinding. They work seamlessly with CommandifiedMember as well, so your undo/redo-aware code can use them out of the box.

    - EventThrottler. An event throttler which filters out duplicate events in an event stream coming into an observer. This can greatly improve performance in your UI without needing to do anything other than placing it between the event source and your real handler. If your UI is flooded with events from data structures observed by your UI or a middle tier, you can use this class to filter out duplicates and avoid redundant updates to UI elements, or to avoid having observers choke on many redundant events.

    - Small, handy stuff. A MultiValueDictionary, which can store multiple unique values per key (instead of one, as with the default Dictionary) and is also merge-aware, so you can merge two into one. A Pair class, to quickly group two elements together. Multiple interfaces for helping to build a de-coupled, observer-based system, and some utility extension methods for the defined data structures.

    We regularly update the library with new code. If you have ideas for new algorithms or want to share your contribution, feel free to discuss it on the project's Discussions page or send us a pull request. Enjoy!
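    The O(1) concat point is easy to see in code. Below is a minimal, hypothetical sketch (written in Java, since the idea is language-agnostic) of a linked list whose nodes carry no reference to their owning list, so concatenation is pure pointer surgery. The class and method names are invented for illustration; this is not Algorithmia's actual API.

        // Hypothetical sketch, not Algorithmia's implementation: nodes hold no
        // back-reference to their owning list, so two lists can be joined in O(1).
        public class ConcatList<T> {
            private static final class Node<U> {
                U value;
                Node<U> next;
                Node(U value) { this.value = value; }
            }

            private Node<T> head, tail;

            public void add(T value) {
                Node<T> node = new Node<>(value);
                if (head == null) { head = node; } else { tail.next = node; }
                tail = node;
            }

            // O(1) concat: relink one pointer; there is no per-node owner field
            // to rewrite (the bookkeeping .NET's LinkedList cannot avoid, as the
            // post notes).
            public void concat(ConcatList<T> other) {
                if (other.head == null) { return; }
                if (head == null) { head = other.head; } else { tail.next = other.head; }
                tail = other.tail;
                other.head = null;
                other.tail = null; // 'other' is emptied, not copied
            }
        }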

    Read the article

  • PASS Summit – looking back on my first time

    - by Fatherjack
    So I was lucky enough to get my first experience of PASS Summit this year, and took some time beforehand to read some blogs and reference material to get an idea of what to do and how to get the best out of my visit. Having been to other conferences, technical and non-technical, I had a reasonable idea of the routine and what to expect in general. Here is a list of a few things that I have learned (or remembered) as the week has gone by.

    Wear comfortable shoes. This actually needs to be broadened to: take several pairs of comfortable shoes. You will be spending many, many hours on your feet, several days one after another. Having comfortable feet that can literally support you for the duration will make the week in general a whole lot better. You could well be walking not only at the conference but also getting to and from it, and in the evenings you will be walking around town and standing talking in various bars and clubs. Looking back, on some days I was on my feet for over 20 hours.

    Make friends. This is a given for the long-term benefits it brings, but there is also an immediate reward in being at a conference with a friend or two. Some events are bigger and more popular than others, and some have the type of session that every single attendee will want to be in. This is great for those that get in, but if you are in the bathroom or queuing for coffee and you miss out, it sucks. Having a friend who can get into a room and reserve you a seat is a great advantage: you get the content that you want to see and still have the coffee that you need.

    Don't go to every session you want to see. This might sound counterintuitive, and it relies on the sessions being recorded in some way to guarantee you don't totally miss out. Both PASS Summit and SQLBits sessions are recorded (Summit is audio, SQLBits is video), and this means that if you get into a good conversation with someone over a coffee you don't have to break it up to go to a session. Obviously there is a trade-off here and you need to decide on the tipping point for yourself, but a conversation at a place like this could make a big difference to the next contract or employer you have, or it might simply be great catching up with some friends you don't see so often.

    Go to at least one session you don't want to. Again, this will seem contrary to normal logic, but there is no reason why you shouldn't learn about a part of SQL Server that isn't part of your daily routine. Not only will you learn something new, but you will also pick up on the feelings and attitudes of the people in the session. So, if you are a DBA, head off to a BI session, and so on. You'll hear BI speakers speaking to a BI audience and get to understand their point of view and their reasoning for making the decisions they do. You will also appreciate the way that your decisions and instructions affect the way they have to work. This will help you a lot when you are on a project working with multiple teams, and make you all more productive.

    Socialise. While you are at the conference venue, speak to people. Ask questions; be interested in whoever you are speaking to. You get chances to talk to new friends at breakfast, dinner and every break between sessions. The only people that might not talk to you are speakers that are about to go and give a session; in most cases speakers like peace and quiet before going on stage. Other than that, the people around you are just waiting for someone to talk to them, so make the first move. There is a whole lot going on outside of the conference hours too, and you should make an effort to join in with some of it, whether at karaoke evenings or just out for a quiet drink with a few of the people you meet at the conference. Either way, don't be a recluse and hide in your room or be alone out in the town.

    Don't talk to people. Once again this sounds wrong, but stay with me. I have spoken to a number of speakers since Summit 2013 finished, and they have all mentioned the time it has taken them to move about the conference venue due to people stopping them for a chat or to ask a question: 45 minutes to walk from a session room to the speaker room in one case. Wow. While none of the speakers were upset about this sort of delay, I think delegates should take the situation into account and possibly defer their question to an email, or to a time when the person they want is clearly less in demand. Give speakers a chance to enjoy the conference in the same way that you are; they may actually want to go to a session, or just have a rest after giving their own session, because talking for 75 minutes is hard work and talking for an extra 45 minutes right after is a lot to ask. I certainly hope that they get good feedback on their sessions, and perhaps if you spoke to a speaker outside a session you can give them a mention in the 'any other comments' part of the feedback, just to convey your gratitude for them giving up their time and expertise for free.

    Say thank you. I just mentioned giving the speakers a clear, visible 'thank you' in the feedback, but there are plenty of other people who help make any conference the success it is and who would really appreciate hearing that their efforts are valued. People on the registration desk, volunteers giving schedule guidance and directions, and people in the community zone are all giving their time to help you have the best experience possible. Send an email to PASS and convey your thoughts about the work that was done. Maybe you want to be a volunteer next time, so you could enquire how to get into that position at the same time.

    This isn't an exclusive list and you may agree or disagree with the points I have made; please add anything you think is good advice in the comments. I'd like to finish by saying a huge thank you to all the people involved in planning, facilitating and executing PASS Summit 2013; it was an excellent event and I know many others think it was a totally worthwhile event to attend.

    Read the article

  • Should Git be used for documentation and project management? Should the code be in a separate repository?

    - by EmpireJones
    I'm starting up a Git repository for a group project. Does it make sense to store documents in the same Git repository as code? It seems like this conflicts with the nature of the Git revision flow. Here is a summary of my question(s): Is the Git revisioning style going to be confusing if both code and documents are checked into the same repository? Any experiences with this? Is Git a good fit for documentation revision control? I am NOT asking whether a revision control system in general should or shouldn't be used for documentation - it should. Thanks for the feedback so far!

    Read the article

  • Feedback on meeting of the Linux User Group of Mauritius

    Once upon a time in a country far, far away... Okay, actually it's not that bad, but it has been a while since the last meeting of the Linux User Group of Mauritius (LUGM). There have been plans in the past but they never really happened. Finally, Selven took the opportunity and organised a new meetup with low administrative overhead, proper scheduling with alternative dates, and a small attendees' survey on the preferred option. All the pre-work was nicely executed. At first I wasn't sure whether it would be possible to attend, but luckily I got some additional information, like children being welcome too, and I was sold on this community gathering. According to other long-term members of the LUGM it was the first time 'ever' that a gathering was organised outside of Quatre Bornes, and I have to admit it was great!

    [Image: LUGM user group meeting on 15.06.2013 in L'Escalier]

    Quick overview of Linux & the LUGM: With a little bit of delay, the LUGM meeting officially started with a quick overview of and introduction to Linux presented by Avinash. During the session he told the audience that there had been quite some activity around the island some years ago, but unfortunately it had gone quiet in recent times. Of course, we also spoke about the acknowledged world dominance of Linux, thanks to Android, and the interesting possibilities for countries like Mauritius. It is known that a couple of public institutions have their back-end infrastructure running on Red Hat Linux systems, but the presence on the desktop is still very low; users are simply hanging on to Windows XP and older versions of Microsoft Office. Following the introduction to the LUGM, Ajay joined the session and it quickly changed into a panel discussion with lots of interesting questions and answers, sharing of first-hand experience with Linux either on the job or in private use, and a couple of ideas about how the LUGM could promote Linux a bit more in Mauritius. It was great to get an insight into other attendees' opinions and activities, especially considering that I've been using Linux since around 1996/97. Frankly speaking, I bought a SuSE 4.x distribution back in those days because I couldn't achieve certain tasks on Windows NT 4.0 without spending a fortune.

    OpenELEC Mediacenter: Next, Selven gave us a decent introduction to OpenELEC: Open Embedded Linux Entertainment Center (OpenELEC) is a small Linux distribution built from scratch as a platform to turn your computer into an XBMC media center. OpenELEC is designed to make your system boot fast, and the install is so easy that anyone can turn a blank PC into a media machine in less than 15 minutes. I didn't know about it until this presentation. In the past, I was mainly attached to Video Disk Recorder (VDR), as it allows the use of satellite receiver cards very easily. Hm, somehow I'm still missing my precious HTPC that I had to leave back in Germany years ago. It was a great piece of hardware and software: a self-built PC in a standard HiFi-sized (43cm) black desktop casing with two full-featured Hauppauge DVB-S cards, an old-fashioned Voodoo graphics card, a WiFi card and a Pioneer slot-in DVD drive, fully remote-controlled via infra-red thanks to Debian, VDR and LIRC. With an EPG, scheduled recordings and general multimedia centre features, it offered all the necessary comfort in the living room, besides a Nintendo game console (actually a GameCube at that time)... But I have to admit that putting OpenELEC on a Raspberry Pi would be a cool DIY project in the near future.

    [Image: LUGM - our next generation of Linux users (15.06.2013)]

    Project Evil Genius (PEG): Don't be scared of the paragraph header. Ish gave us a cool explanation of why he named it PEG, Project Evil Genius: it's because of the time of day when he was scripting down his ideas for a way to build, package and provide software applications to various Linux distributions. The main influence came from openSuSE, but the platform didn't cater for his needs and ideas, so he started to work out something on his own. During his passionate session he also talked about the amazing experiences he has had with other Linux users from all over the world. Ish promised to put his script on GitHub during the next couple of days... Looking forward to that. Check out Ish's personal blog over at hacklog.in; highly recommended reading. Why India? Simply because the registration fees per year for an Indian domain are approximately 20 times less than for a Mauritian domain (.mu).

    [Image: Exploring the beach of L'Escalier after the meeting]

    'After-party' at the beach of L'Escalier: Phew, after such interesting sessions, ideas around Linux, and good conversation during the breaks and over lunch, it was time for a little break-out. Selven suggested that we all head down to the beach of L'Escalier and get some impressions of nature down in the south of the island. Talking about 'beach' ;-) ... it's absolutely not comparable to the white-sanded ones here in Flic en Flac. There are no lagoons on the south coast of Mauritius, and watching the breaking waves is a different experience and joy after all. Unfortunately, I was a little bit worried about the thoughtless littering at such a remote location. You have to drive on natural paths through the sugar cane fields, and I was really shocked by the amount of rubbish lying around almost everywhere. Sad, really sad, and it concurs with Yasir's recent article on the same topic.

    Resumé & outlook: It was a great event. I met new people, had some good conversations, and even my children enjoyed themselves the whole day. The location was well chosen, with enough space for everyone, parking spaces and even a playground for the children. Also, a big "Thank You" to Selven and his helpers for the organisation and the preparation of lunch. I'm fairly sure that this was an exceptional meeting of the LUGM, and I'm really looking forward to the next gathering of Linux geeks. Hopefully soon. All images are courtesy of Avinash Meetoo. More pictures are available on Flickr.

    Read the article

  • DevPartner Profiler Community Edition

    - by Henry
    I used to have DevPartner Profiler Community Edition installed on my machine, which was free to use indefinitely and was compatible with Visual Studio 2003. After I rebuilt my machine and downloaded a newer version (version 8.1) of the software, it turned out to be a 45-day trial, even though it is still called a Community Edition. I have tried looking for an older version, but haven't been able to find one. Could someone help me out please?

    Read the article

  • Use msbuild community tasks without installing

    - by mickyjtwin
    In our developer environment, no users have administration rights, so it's not possible to install MSBuild.CommunityTasks without getting an admin to do it. What I'm wondering is whether, similar to NAnt, it's possible to include the files in your solution directory and just reference them from there. That way, it won't matter whether a person has it installed: when they check out the latest code, the MSBuild community files will come with it, and the solution will build.

    \SolutionDir\{solution}.sln
    \SolutionDir\Project\{files}
    \SolutionDir\MSBuild.Community\files

    Read the article

  • Patch Management and System Inventory on a Windows network?

    - by Scott
    What are some good ways to handle patch management and systems/hardware inventory for a Windows (Server 2003 or 2008) network? For example, at a minimum, knowing the basics for all the machines out on the network, such as OS version, patch level, installed hotfixes, processor, RAM, etc. Even better would be knowing more details, such as peripherals. Ideally, if there were a way to push service packs, hotfixes (and other software?) to the machines, that would be great. What are some options for this?

    Read the article

  • Open-source training class/room/instructor resource management software?

    - by Kyle Eli
    We're looking to replace an internal system used for managing training classes with something a bit more robust. It needs to be open-source or have a license level that grants access to source, and it needs to be ASP.NET (C# preferred, but we could live with VB.NET). Ultimately, we'll need to be able to assign facilities and instructors, manage attendees, send notifications, and build calendar views. We'll also be integrating with our website to allow online sign-up and other things for attendees to manage on their own. We do expect to implement quite a bit of it in-house, but we'd like as broad a base to start from as we can get. Still, just a really good web-based meeting-room reservation system might make a good enough starting point. In list form:

    - Meeting/training resource management software
    - ASP.NET (C# or VB.NET)
    - Source available
    - We're expecting to have to modify the software to meet all of our requirements

    Read the article

  • What poor management decisions have you had to deal with?

    - by tombull89
    As a junior technician I've had to deal with (or will have to deal with) some problems in the past, and being only a junior technician I don't have the confidence or respect from management staff to speak up. For instance, we're getting an entirely new system, going from Windows Server 2003/XP to Windows Server 2008 R2/Windows 7/VMware/digital signage, and the amount of time currently dedicated to training the IT support department stands at 0. They seem to think that all IT systems are the same, and they are going to get a bit of a shock when I can't help them. I think there's some UK legislation saying a school/business has to put money and time aside for training, but I'm not sure. What have you had to deal with?

    Read the article

  • Have you successfully installed Active Directory Management Gateway Service on 2008?

    - by ssg31415926
    I've got a pair of 2008 DCs onto which I've been trying to install the Active Directory Management Gateway Service. (I only wanted it on one, but when that didn't work I tried the other.) Both fail with: "The update does not apply to your system". They're both Windows Server 2008 domain controllers, x64 version. Both have .NET 3.5 SP1 installed. One has Service Pack 2 installed and the other has the hotfix from KB967574 installed. Both servers were rebooted after the install of each of the required updates. I have attempted to install Windows6.0-KB968934-x64.msu on both and get the error reported above. According to the docs, my machines meet the requirements, so something must be missing from the requirements, or there's something odd about these DCs. Has anyone succeeded? If so, did you have to install/remove anything (else) to get it installed?

    Read the article

  • SQL SERVER – SELECT TOP Shortcut in SQL Server Management Studio (SSMS)

    - by pinaldave
    This tool is pretty old, yet it always comes up as a handy tip. I had a great trip at TechEd in India and, during one of my presentations, I was asked if there are any shortcuts to SELECT only the TOP 100 records from SSMS. I immediately told him that if he explores the table in SSMS, he can just right-click on it and SELECT TOP 1000 records. If he wanted only 100 records, he could change that 1000 to 100 by going to Options: go to Options, hover the mouse over SQL Server Object Explorer, then proceed to Commands, and change the value for 'Select Top <n> Audit Records'. After narrating the steps, he told me that he was not looking for the right-click option; rather, he was asking if there is any kind of keyboard shortcut for convenience's sake. Actually, a keyboard shortcut is also possible: SQL Server Management Studio (SSMS) lets you configure the settings you want as a shortcut. Here is how you can do it: go to Options, then to Environment, proceed to Keyboard, and from there configure your T-SQL with the desired keyword. Now open a new SSMS query window, click in it and type any table name, and then hit the shortcut you made earlier. Doing this should display the TOP 100 records in the Results window. I am sure this trick is quite old, but it is still helpful to many. Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • It&rsquo;s All About Expectation Management

    - by D'Arcy Lussier
    I saw this tweet from Gerald Weinberg today: [tweet image in the original post]. I'd expand on this: it's not just managers, it's our clients as well. With so much focus on "agile" and reducing the amount of wasteful documentation created, those that typically consume traditional deliverables haven't caught up. For many, there is still a correlation between seeing a mountain of paper, or a 30-page Word document, or a 40-slide PowerPoint, and feeling like some "work" was done. The "Value Driven Development" movement is still in its infancy, even with the adoption and success stories. So, we have two options: we can complain about it, or we can learn how to live with it while continuing to evangelize the benefits of value over bloat. The reality is that perceived value is still value, so what's important, especially in a situation as Gerald mentions where management or clients don't understand the work, is to find out what the manager/client values and deliver to that. That doesn't mean you don't discuss it. That doesn't mean that if you see risks in what a manager/client is asking, you don't question it and provide alternatives. But it does mean that you don't slam the door on it; you don't just toss it aside and ignore what their perceived value is. The world isn't perfect, primarily because it's filled with imperfect people. The only way to get better is to engage and not dismiss each other, even if we disagree on value.

    Read the article

  • Complex knowledge management system with CRM..written internally

    - by JonH
    We've all heard of Salesforce and SugarCRM and systems like them. Unfortunately, at my workplace we have been asked to write a similar system (rather than license or purchase one). Basically the database is fairly large. Think of modules such as: corporate groups, customers, programs, projects, sub-projects, and issue management. In simple terms, a corporate group has one to many customers, a program has one or more projects, a project has one or more sub-projects, and an issue can be created on many sub-projects. Of course the system is a bit more complex, but instead of listing every single module I think it's best to keep it simple. In any event, the system in its current state has only two resources working on it (basically we have to do it all: CSS, database, jQuery, ASP.NET and C#). We've started off well by defining the UI master and footer pages, so we can reuse those across all of our pages. Now comes the hard part. The system will have about 4k end users, with say 5-10% being concurrent users. We are wondering if it makes sense to cache our database data (for say 5-10 minutes) rather than continuously hit our database. The reason is that some of these pages may have 5-10 search filters associated with them; imagine how many database hits every time a selection is made from a search box. Also, some of these search fields cascade, so selecting an initial drop-down, for instance, may cascade to several drop-down boxes under it. Is it wrong to cache? I am not finding many articles on whether it is a good idea or not. Remember, the system is similar to a CRM system where we manage our various customers, programs, projects, sub-projects, issues, etc.
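    A read-through cache with a short TTL is one common way to approach the 5-10 minute idea above. Below is a minimal, hypothetical Java sketch (class and method names invented for illustration, as is the DAO in the usage note); thread safety here is coarse, and a burst of concurrent misses will each still hit the database once.

        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;
        import java.util.function.Supplier;

        // Hypothetical sketch: cache filter/lookup results for a fixed TTL so
        // cascading search boxes don't hit the database on every selection.
        public class TtlCache<K, V> {
            private static final class Entry<W> {
                final W value;
                final long expiresAt;
                Entry(W value, long expiresAt) { this.value = value; this.expiresAt = expiresAt; }
            }

            private final Map<K, Entry<V>> entries = new ConcurrentHashMap<>();
            private final long ttlMillis;

            public TtlCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

            public V get(K key, Supplier<V> loader) {
                long now = System.currentTimeMillis();
                Entry<V> hit = entries.get(key);
                if (hit != null && now < hit.expiresAt) {
                    return hit.value; // still fresh: skip the database
                }
                V value = loader.get(); // e.g. run the filter query
                entries.put(key, new Entry<>(value, now + ttlMillis));
                return value;
            }
        }

        // Usage (hypothetical DAO): cache a drop-down's options for 5 minutes.
        //   TtlCache<String, List<Project>> cache = new TtlCache<>(5 * 60 * 1000);
        //   List<Project> options = cache.get("projects:" + customerId,
        //           () -> projectDao.findByCustomer(customerId));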

    Read the article

  • Adopting Technologies for the Sake of Technologies

    - by shiju
    Unlike other engineering industries, the software engineering industry really lacks maturity, and the lack of maturity can be seen in different aspects of the entire software development life cycle. I think other engineering industries are well organised and structured, with common, proven engineering practices. The software engineering industry is a greatly diverse industry with different operating systems and a variety of development platforms, programming languages, frameworks and tools. These days, people go after hypes and intellectual fashions without understanding their core business problems, adopting technologies and practices for the sake of technologies and practices and simply becoming a "poster child" of them. Understanding the core business problem and providing the best, solid solution with a platform-neutral approach will give you more business value and ROI than blindly adopting technologies and tailor-making your applications for the sake of technologies and practices. People have been migrating their solutions in favour of new technologies and different versions of frameworks without any business need.

    The "Pepsi Challenge" in Software Development: The Pepsi Challenge marketing campaign of the 1980s was a popular and very interesting promotion in which people tasted one cup of Pepsi and another cup of Coca-Cola. In the taste test, more than 50% of people preferred Pepsi over Coca-Cola. The success story behind Pepsi was the greater sweetness of its cola: they had simply added more sugar, and more people preferred the sweeter flavour. But you can't identify the better product after sipping one cup of cola, based only on how sweet it is. The same thing has been happening in the software industry when choosing development frameworks and technologies. People have been choosing frameworks based on the initial sugary feeling, without understanding their core strengths and weaknesses. The sugary framework might be more harmful when you develop real-world systems. There is no silver bullet for solving all kinds of problems, and frameworks and tools do have strengths and weaknesses, so it would be better to understand them. And please keep in mind that you have to develop real apps to understand the real capabilities and weaknesses of a framework. Evaluating a technology based on a few blog posts will harm your projects, and those bloggers might lack real-world experience with the framework.

    The Problem with Aligning a Development Practice with Tools: Recently I observed a discussion in a group where one person asked for suggestions for practicing Continuous Delivery (CD) as part of agile-based application engineering. The discussion quickly turned to using and choosing a Continuous Integration (CI) tool, and different people suggested different CI tools for simply practicing Continuous Delivery. If you have worked with core agile engineering practices, you know that the real essence of agile is neither choosing a tool nor choosing a process. Simply choosing a CI tool from a particular vendor will not ensure that you are delivering evolving software based on customer feedback. You have to understand the real essence of an engineering practice and choose the right tool for practicing it, instead of focusing on a particular tool as if it were the practice. If you want to adopt a practice, you need a solid understanding of its real essence; tools just help us with better automation.

    Adopting New Technologies for the Sake of Technologies: The other problem is that developers have had a tendency to adopt new technologies and simply migrate their existing apps to them. It is okay if your existing system has problems with its technology stack or a maintainability challenge, and you move to a new technology to solve those problems. We also adopt new technologies to solve new challenges, like scalability when the application or user base is growing unpredictably. Please keep in mind that all new technologies become old after you work with them for a few years. The Facebook status update from Janakiraman below [image in the original post] expresses the attitude of a typical customer. For example, Node.js is becoming the hottest buzzword in the software industry, and many developers are trying to adopt it for their apps. The important thing is that Node.js is a minimalist framework that does some great things for some problems, but it's not a silver bullet. I have been working with Node.js, which is good for some problems but a really bad choice for all kinds of problems. Adopting new technologies for new projects is good if we get real business value from it, because a newer framework may solve some existing well-known problems and provide better solutions for the latest challenges. But adopting a new technology for the sake of new technology is a really bad idea. Another example: JavaScript is getting a lot of attention, so a lot of developers are building heavy JavaScript-centric web apps. They adopt a client-side JavaScript MV* framework such as AngularJS, Ember or Backbone and develop a Single Page App (SPA), repeating the mistakes we made in the past on the server side; those server-side mistakes are simply being transferred to the client side. The problem is that people are just adopting new technologies, not improving their solutions. I predict that many Single Page Apps will suck in the future. We need a hybrid approach where we can leverage both server side and client side for developing next-generation web apps. A related problem is liking a particular framework and using it for all kinds of apps. In the past, I knew some Silverlight-passionate guys who tried to use that framework for everything, including larger line-of-business apps; these days, developers are migrating their existing Silverlight apps in favour of the HTML5 buzzword. So the real question is: what business value are we getting from these apps when we develop them for the sake of a particular technology instead of a business need? Another problem is that our solutions consultants try to provide unnecessary solutions for the sake of a particular technology or a hype. For example, Big Data solutions are great for solving the problem of the three Vs: volume, velocity and variety. But trying to apply them to every application creates problems. Saying that a small web site running on a limited budget needs a recommendation engine backed by a Hadoop-based solution on a 16-node cluster would be really horrible. If we really need a Hadoop-based solution, go for it, but trying to apply it to every application would be a big disaster. It would be great if we could understand the core business problems first and then choose the right framework for solving the actual business problem, instead of trying to push so many solutions.

    The Problem with Being Tied to a Platform Vendor: Some organizations and teams are tied to a particular platform vendor and don't want to use any product other than their preferred or existing vendor's. They will accept any product provided by the vendor regardless of its capability. This gives you some benefits regarding integration and collaboration between different products provided by the same vendor, but you lose the opportunity to provide a better solution for your business problems. As a real-world sample scenario, a lot of companies have been using SAP for their ERP solutions. When they think about mobility, or about developing hybrid mobile apps, they can easily find a framework from SAP: SAP provides a framework for HTML5-based UI development named SAPUI5. If you adopt that framework based only on the preference for your existing platform vendor, you might lose other opportunities to provide a better solution. Initially you might enjoy the sugary feeling provided by the platform vendor, but you have to think about developing apps that are capable of solving future challenges. I am not saying that any particular framework is bad, and I believe that every framework is better than another for solving at least one problem. My point is that we should not be tied to any specific platform vendor unless our organization has resource-availability problems.

    Being Polyglot to Provide the Right Solutions: The modern software engineering industry is greatly diverse, with different tools and platforms. Lots of open source frameworks and new programming languages have been released to the developer community, and choosing the right platform without a biased opinion is a really difficult task. But it would be really great if we could develop a platform-neutral mindset and become polyglot developers, providing better solutions based on the actual business problems. IMHO, we should learn a new programming language and a new framework every year. This will improve the quality of our developer capabilities and also the quality of our primary programming-language skills. Being polyglot, as individual developers and as organizational teams, will greatly broaden your developer experience and benefit your applications. Organizations can analyse their business problems without being tied to any technology, and later provide solutions using different platforms and tools.

    Summary: In this blog post, what I was trying to say is that we should not be tied to, or biased towards, any development platform, technology, vendor or programming language, and we should not adopt technologies and practices for their own sake. If we adopt a technology or a practice for the sake of it, we simply become a "poster child" of that technology or practice. We should not become poster children of other people's intellectual thoughts and theories; instead we should become solutions developers and solutions consultants who can provide better solutions for business problems. Being a polyglot developer is a good way to improve your developer skills, and it lets you provide better solutions for business problems. The most important thing is that we should become platform-neutral developers whose passion is providing brilliant solutions. It would be great if we could provide minimalist, pragmatic business solutions. You can follow me on Twitter @shijucv

    Read the article

  • Mark Wilcox Discusses Privileged Account Management

    - by Naresh Persaud
    The new release of Oracle Identity Management 11g R2 includes the capability to manage privileged accounts. Privileged accounts, if compromised, create a risk of fraud in the enterprise, and as a result controlling access to privileged accounts is critical. The Oracle Privileged Account Manager solution can be deployed stand-alone or in conjunction with the Oracle Governance Suite for a comprehensive solution. As part of the comprehensive platform, Privileged Account Manager is interoperable with the Identity suite. In addition, Privileged Account Manager can re-use Oracle Identity Manager connectors for propagating changes to target systems; the two are interoperable at the data level. I caught up with Mark Wilcox, Principal Product Manager of Oracle Privileged Account Manager, and discussed the capabilities of the offering with him in this podcast. Click here to listen.

    Read the article

  • Java game object pool management

    - by Kenneth Bray
    Currently I am using arrays to handle all of my game objects in the game I am making, and I know how terrible this is for performance. My question is: what is the best way to handle game objects without hurting performance? Here is how I am creating an array and then looping through it to update the objects in it:

        public static ArrayList<VboCube> game_objects = new ArrayList<VboCube>();

        /* add objects to the game */

        while (!Display.isCloseRequested() && !Keyboard.isKeyDown(Keyboard.KEY_ESCAPE)) {
            for (int i = 0; i < game_objects.size(); i++) {
                // draw and update each object
                game_objects.get(i).Draw();
                game_objects.get(i).Update();
                //world.updatePhysics();
            }
        }

    I am not looking for someone to write me code for asset or object management, just point me in a better direction to get better performance. I appreciate the help you guys have provided me in the past, and I don't think I would be as far along with my project without the support on Stack Exchange!
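    One standard direction here is an object pool: pre-allocate instances and reuse them instead of allocating and discarding objects every frame, which keeps the garbage collector quiet during the game loop. Below is a minimal, hypothetical Java sketch (names invented for illustration; it assumes the pooled class has a no-arg constructor and that callers reset an object's state before releasing it):

        import java.util.ArrayDeque;
        import java.util.Deque;
        import java.util.function.Supplier;

        // Hypothetical sketch of a simple object pool for a single-threaded game loop.
        public class ObjectPool<T> {
            private final Deque<T> free = new ArrayDeque<>();
            private final Supplier<T> factory;

            public ObjectPool(Supplier<T> factory, int preallocate) {
                this.factory = factory;
                for (int i = 0; i < preallocate; i++) {
                    free.push(factory.get());
                }
            }

            // Hand out a pooled instance, growing the pool only when it is empty.
            public T acquire() {
                return free.isEmpty() ? factory.get() : free.pop();
            }

            // Return an instance for reuse; the caller resets its state first.
            public void release(T object) {
                free.push(object);
            }
        }

        // Usage (hypothetical, reusing the poster's VboCube class):
        //   ObjectPool<VboCube> cubes = new ObjectPool<>(VboCube::new, 64);
        //   VboCube cube = cubes.acquire();
        //   ... later, when the object leaves play: cubes.release(cube);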

    Read the article

  • Customers Live on Oracle Fusion Human Capital Management

    - by Scott Ewart
    Oracle HCM Cloud Service Helps Power HR's Contribution to the Business. More than 25 of the 100-plus customers who have selected Oracle Fusion Human Capital Management (HCM) are already live. Ardent Leisure, Peach Aviation, Toshiba Medical Systems and Zillow have deployed Oracle HCM Cloud Service and are using it to transform their HR operations. They join companies such as Principal Financial Group and Elizabeth Arden, who are already using Oracle HCM Cloud Service to help manage international growth and deliver pervasive, role-based, configurable solutions to their employees. See the full press release here: http://www.oracle.com/us/corporate/press/1859573?sc=OPR-TW

    Read the article

  • How to Manage and Use LVM (Logical Volume Management) in Ubuntu

    - by Justin Garrison
    In our previous article we told you what LVM is and what you may want to use it for; today we are going to walk you through some of the key management tools of LVM, so you will be confident when setting up or expanding your installation. As stated before, LVM is an abstraction layer between your operating system and physical hard drives. What that means is that your volumes are no longer tied to the physical hard drives and partitions they reside on. Rather, the hard drives and partitions that your operating system sees can be any number of separate hard drives pooled together or in a software RAID.

    Read the article

  • Customer Experience Management for Retail 2.0 - part 2 / 2

    - by Sanjeev Sharma
    In the previous post, I discussed some of the key trends shaping the retail industry, their implications, and the challenges facing retailers seeking to regain control of the buyer-seller relationship. Is Customer Experience Management the panacea for ailing retailers who are now awakening to the power of the consumer? Quite honestly, customer acquisition, retention and satisfaction have been top of mind for retailers for quite some time now. The missing piece of this puzzle is bringing all those countless hours of strategy and planning to fruition. This is more of an execution gap than anything else. Although technology has made consumers more informed, more mobile and more social, customer experience is still largely defined by delivering on the following:

    - Consistent experiences, whether shopping online or offline
    - Personalize-able interaction ("mass market" sounds good as an internal strategy, but not when you are a buyer!)
    - Timely order fulfillment, if not proactive notification of delays

    Below is a concept architecture for streamlining front-end, mid-office and back-end interfaces through shared processes, to achieve consistency and efficiency in managing the customer experience from order capture to order provisioning.

    Read the article

  • Adventures in Lab Management Configuration: CMMI Edition Part 1 of 3

    - by Enrique Lima
    I remember at one point someone telling me how close Migrate is to Migraine. This was a migration of an environment from TFS 2008 to TFS 2010 in which the process template needed to be migrated too, in this case from CMMI v4.2 to CMMI v5.0. Migrating the TFS infrastructure is one thing; migrating the process template is a different deal, not hard, just involved. I followed a combination of steps that came from a blog post as the main guidance, plus MSDN (as suggested in the guidance post) to complement some tasks and steps. Again, the focus here is CMMI. The high-level steps taken to enable the migrated TFS 2008 CMMI v4.2 process template on TFS 2010 were:

    1) Back up the Collection, Configuration and Warehouse databases.
    2) Download the process template using Visual Studio 2010.
    3) Export, modify and import the Bug type definition.
    4) Export, modify and import the Scenario or Requirement type definition.
    5) Create and import bug field mappings.

    Now we can attempt to connect using Test Manager, and you should be able to get this going. After that was done, it was time to enroll VMs that already existed in the environment. This was a bit more challenging, but in the end it was a matter of analyzing the changes that had been made as a temporary workaround between the time we migrated and the time we converted the work items, and adding fields to enable communication between the project and the Test and Lab Manager components. There are two more parts to this post: the second will describe the detailed steps taken to complete the process template update, and the third will talk about the gotchas and fixes for the Lab Management portion.

    Read the article

  • ING Selects Oracle Fusion Human Capital Management

    - by Scott Ewart
    Leading Financial Services Firm Seeks To Strengthen HR's Role In Driving The Business. ING Bank Netherlands, a leading financial services organization, has selected Oracle Fusion Human Capital Management (HCM). ING's decision to deploy Oracle Fusion HCM was driven by its ongoing desire to strengthen HR's role in driving the business. Read more here.

    Read the article

  • Very slow wireless connection (unrelated to power management) using Netgear N150 WNA1100 in 12.04

    - by vgaldikas
    I am using a Netgear N150 WNA1100 USB adapter. The network driver is ath9k_htc. When I use Ubuntu 12.04, my wireless connection is much slower than when I use the WNA1100 with my Windows 7 installation on the same machine. I have been researching this, and the most common reason seems to be related to power management. However, for me power management is off by default. I had entered a bunch of different commands and, at one stage, got it to work at full power, but then I rebooted the machine and it went back to being slow again. After entering all the different commands I was afraid that I might have messed something up, so I performed a fresh install of Ubuntu. I still have the same problem: my wireless connection is still much slower when using Ubuntu than when using Windows 7. I am not sure what other info could help... EDIT: I have made a live USB from the ISO file I used to install Ubuntu on this PC and ran it on my laptop. Still the same problem. However, the laptop works fine on a cable connection. I suppose this means that my wireless adapter and network driver can be eliminated as the root of the problem, so maybe it's something to do with the router itself? I use a ZyXEL P-660HW-D1.

    Read the article
