Search Results

Search found 16836 results on 674 pages for 'power management'.


  • SQL SERVER – Identify Most Resource Intensive Queries – SQL in Sixty Seconds #028 – Video

    - by pinaldave
    During performance tuning conversations, the very first question people often ask is: which queries are offending the server, or in other words, which queries are the most resource intensive? The resources in question are usually memory, CPU, or IO, and the same classification applies to queries. A query that does lots of reads or writes is certainly resource intensive, as is a query that takes up the most CPU time. Performance tuning is a very deep subject, and we all have our own preferences regarding what the first step of tuning should be and what should be taken with a grain of salt. There is no denying, though, that a query which uses more resources than it should requires tuning. There are many ways to identify queries using intense resources (e.g. Extended Events), but in this one we will go with a simple DMV. There is a small gotcha we all have to remember about DMV usage: it only brings back results from the existing cache. So if you have a query which is very resource intensive but is not cached, or if you have explicitly removed the query from the cache, it will not be part of the result returned by this DMV. It is quite possible that a query has aged out of the cache if your cache is not huge. If your cache is large, you may want to be careful about running this query during business hours, as the query itself can be resource intensive. Get the script to identify resource intensive queries from Here Related Tips in SQL in Sixty Seconds: SQL SERVER – Find Most Expensive Queries Using DMV Simple Example to Configure Resource Governor – Introduction to Resource Governor SQL SERVER – DMV – sys.dm_exec_query_optimizer_info – Statistics of Optimizer SQL SERVER – Wait Stats – Wait Types – Wait Queues – Day 0 of 28 Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video Tagged: Excel
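
    The script itself is linked above; as a rough illustration of the DMV approach described in this post (a sketch using sys.dm_exec_query_stats, not the exact script from the video), a query along these lines ranks cached statements by CPU and logical IO:

    -- Sketch only: rank cached queries by total worker time (CPU).
    -- Results come from the plan cache, so uncached or evicted queries will not appear.
    SELECT TOP (10)
        qs.execution_count,
        qs.total_worker_time / qs.execution_count AS avg_cpu_time,
        qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS query_text
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
    ORDER BY qs.total_worker_time DESC;

    Swapping the ORDER BY to total_logical_reads or total_logical_writes surfaces the IO-heavy statements instead of the CPU-heavy ones.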

    Read the article

  • Tips To Manage An Effectively Come Back To Work After A Long Vacation

    - by Gopinath
    Vacations are very relaxing – no need to reply to endless mails, no marathon meetings or conference calls. It’s all about fun during the vacation. The troubles begin as you near the end of the vacation and start to think about getting back to work. Once we are back at work after a long vacation there are many things to worry about – a pile of snail mail, hundreds of unread emails, a flood of phone calls to answer and a stream of scheduled meetings. How do you handle all the backlog and catch up quickly with the inflow of work? Here is a management tip from the Harvard Business Review blog to get back to work the right way after a long vacation: Block off your morning. Make sure you don’t have any meetings scheduled or big projects due. Then before you open your inbox, pause and think about your work priorities. As you make your way through emails and voicemails, focus on returning the messages that are connected to what matters most. Defer or delegate things that aren’t top priority. And remember it will probably take more than one day to get caught up, so be easy on yourself. Hope these tips let you plan the right comeback to work after your vacation. cc Image credit: flickr/dfwcre8tive This article titled, Tips To Manage An Effectively Come Back To Work After A Long Vacation, was originally published at Tech Dreams. Grab our rss feed or fan us on Facebook to get updates from us.

    Read the article

  • SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video

    - by pinaldave
    Importing data into a database is one of the most important tasks. I often receive questions about the quickest way to insert CSV data, or how to import CSV data into a SQL Server table. Honestly the process is very simple and the script is even simpler. In today’s SQL in Sixty Seconds video we will learn how quickly we can insert CSV data into SQL Server. The steps to import CSV are very simple: create the table, use BULK INSERT to import the data, verify the data – done! Absolutely, it is that simple. More on Importing CSV Data: SQL SERVER – Import CSV File Into SQL Server Using Bulk Insert – Load Comma Delimited File Into SQL Server SQL SERVER – Import CSV File into Database Table Using SSIS SQL SERVER – Create a Comma Delimited List Using SELECT Clause From Table Column SQL SERVER – Comma Separated Values (CSV) from Table Column SQL SERVER – Comma Separated Values (CSV) from Table Column – Part 2 I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video
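
    As a hedged sketch of the BULK INSERT step described above (the table definition, column names and file path C:\Data\people.csv are made-up examples, not taken from the video):

    -- Step 1: create the target table (hypothetical columns).
    CREATE TABLE dbo.CSVImport (ID INT, FirstName VARCHAR(50), LastName VARCHAR(50));
    GO
    -- Step 2: bulk insert the CSV file into the table.
    BULK INSERT dbo.CSVImport
    FROM 'C:\Data\people.csv'
    WITH (
        FIELDTERMINATOR = ',',   -- column delimiter in the CSV file
        ROWTERMINATOR   = '\n',  -- row delimiter
        FIRSTROW        = 2      -- skip the header row, if the file has one
    );
    GO
    -- Step 3: verify the data.
    SELECT * FROM dbo.CSVImport;
    GO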

    Read the article

  • It’s On! Oracle OpenWorld 2012 Call for Papers is Open

    - by David Hope-Ross
    Oracle OpenWorld is among the world’s largest industry events for good reason. It offers a vast array of learning and networking opportunities in one of the planet’s great cities. And one of the key reasons for its popularity among procurement and supply chain professionals is the prominence of presentations by customers. If you’d like to deliver a presentation based on your experience, now is the time to submit your abstract for review by the selection panel. The competition is strong: roughly 18% of entries are accepted each year from more than 3,000 submissions. Review panels are made up of experts both internal and external to Oracle. Successful submissions often (but not exclusively) focus on customer successes, how-tos, or best practices. What’s in it for you? Recognition, for one thing. Accepted sessions are publicized in the content catalog, which goes live in mid-June, and sessions given by external speakers often prove the most popular. Plus, accepted speakers get a complimentary pass to Oracle OpenWorld with access to all sessions and networking events – that could save you up to $2,595! Be sure to designate your session for inclusion in the correct track by selecting “APPLICATIONS: Supply Chain Management” or “APPLICATIONS: Sourcing and Procurement” from the Primary Track drop-down menu. We look forward to seeing you in San Francisco!

    Read the article

  • books about the psychology of decision making and human psychology

    - by boos
    I'm a Unix developer and I want to move into project/people management as a first career step. I think it is sometimes better to have good communication skills and, in general, stronger people skills in order to advance a career faster. At least in Italy, a lot of people have advanced their careers faster because of their people skills rather than their technical skills. Has anyone read books about psychology to better understand how people and personalities work, and to handle decision-making situations the right way? I have found some interesting books about personality and the psychology of decision, but I am in doubt about how useful reading such books really is. Does anyone have experience with this path? Has anyone found it useful to read books like these about how people work, in order to advance their career faster and handle people and decisions in a more useful way? I have already read Peopleware. The table of contents of one of these books includes: 1 - Judgment and decision 2 - Heuristics and systematic errors 3 - Estimating probability and frequency prediction 4 - Risk and decision 5 - Representation and decision 6 - Memory, attention and decision. Etc. What do you think?

    Read the article

  • Page debugging got easier in UCM 11g

    - by kyle.hatlestad
    UCM is famous for its extra parameters you can add to the URL to do different things. You can add &IsJava=1 to get all of the local data and result set information that comes back from the idc_service. You can add &IsSoap=1 and get back a SOAP message with that information. Or &IsJson=1 will send it in JSON format. There are ones that change the display, like &coreContentOnly=1 which will hide the footer and navigation on the page. In 10g, you could add &ScriptDebugTrace=1 and it would display the list of resources that were called through includes or eval functions at the bottom of the page. And it would list them in nested order so you could see the order in which they were called and which components overrode each other. But in 11g, that parameter flag no longer works. Instead, you get a much more powerful one called &IsPageDebug=1. When you add that to a page, you get a small gray tab at the bottom right-hand part of the browser window. When you click it, it will expand and let you choose several pieces of information to display. You can select 'idocscript trace' and display the nested includes you used to get with ScriptDebugTrace. You can select 'initial binder' and see the local data and result sets coming back from the service, just as you would with IsJava. But in this display, it formats the results in easy-to-read tables (instead of raw HDA format). Then you can get the final binder, which contains all of the local data and result sets after executing all of the includes for the display of the page (and not just from the service call). And then there is a 'javascript log' for reporting on the javascript functions and times being executed on the page. Together, these new data displays make page debugging much easier in 11g. *Note: This post also applies to Universal Records Management (URM).

    Read the article

  • MBA versus MSIS

    - by user794684
    I am considering going back to school for my master's and I've been looking at several avenues I can take. I've been considering either an MBA or an MSIS degree. Overall I know that an MBA is going to give me a solid skill set that can help me become an executive. However, they seem to be a dime a dozen these days, and the university I can get into is good, but it's not exactly in the top 100 of anything. My undergrad MINOR was in Business Information Systems. I'm rusty as hell, considering I haven't touched it, but an MSIS would be more in the direction of my past academic experience and seems to touch on both business management and IT. Question... With an MSIS will I just be a middleman? Will I really be an important person with a real skill set, or will I merely be someone who isn't quite cut out to be a manager and who is clueless about the tech side? Is an MSIS degree going to give me a real chance to move up the pay scale quickly, or am I better off learning programming and networking through another BS degree? What will give me more upward mobility career-wise? An MBA or an MSIS?

    Read the article

  • SQL SERVER – Effect of Collation on Resultset – SQL in Sixty Seconds #026 – Video

    - by pinaldave
    Collation is a very important concept, but it is often ignored. I have often seen developers either not understanding it or ignoring it – this is plain wrong. In simple words, collation defines how SQL Server interprets, compares and sorts character data. Well, in today’s SQL in Sixty Seconds we are going to observe how collation affects the resultset. Today’s blog post is inspired by my earlier blog post SQL SERVER – Effect of Case Sensitive Collation on Resultset. I strongly encourage you to read this earlier blog post for sample code as well as additional explanation related to the concept shared in today’s SQL in Sixty Seconds. Here is the code used in the video. USE TempDB GO -- Sample Data Building CREATE TABLE ColTable (Col1 VARCHAR(15) COLLATE Latin1_General_CI_AS, Col2 VARCHAR(14) COLLATE Latin1_General_CS_AS) ; INSERT ColTable(Col1, Col2) VALUES ('Apple','Apple'), ('apple','apple'), ('pineapple','pineapple'), ('Pineapple','Pineapple'); GO -- Retrieve Data SELECT * FROM ColTable GO -- Retrieve Data SELECT * FROM ColTable ORDER BY Col1 GO -- Retrieve Data SELECT * FROM ColTable ORDER BY Col2 GO -- Clean up DROP TABLE ColTable GO Related Tips in SQL in Sixty Seconds: SQL SERVER – Effect of Case Sensitive Collation on Resultset Example of Width Sensitive and Width Insensitive Collation Collation and Collation Sensitivity – Quiz – Puzzle – 6 of 31 Change Collation of Database Column – T-SQL Script Find Collation of Database and Table Column Using T-SQL Default Collation of SQL Server 2008 Cannot resolve collation conflict for equal to operation I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video
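
    As a small follow-up sketch (not part of the video), a query-level COLLATE clause can override the column collation of the ColTable sample above, for example to make the case-sensitive Col2 sort and compare case-insensitively for one query only:

    -- Override Col2's case-sensitive collation just for this ORDER BY.
    SELECT * FROM ColTable
    ORDER BY Col2 COLLATE Latin1_General_CI_AS;
    GO
    -- The same clause works in a comparison, giving a case-insensitive filter on Col2.
    SELECT * FROM ColTable
    WHERE Col2 = 'APPLE' COLLATE Latin1_General_CI_AS;
    GO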

    Read the article

  • SQL SERVER – Identify Most Resource Intensive Queries – SQL in Sixty Seconds #029 – Video

    - by pinaldave
    There are a few questions I often get asked. It is interesting that in our daily work all of us often need the same kind of information at the same time. Here are examples of those similar questions: How many user created tables are there in the database? How many non clustered indexes does each table in the database have? Is a table a heap or does it have a clustered index on it? How many rows does each table in the database contain? I finally wrote down a very quick script (in less than sixty seconds when I originally wrote it) which can answer the above questions. I also created a very quick video to explain the results and how to execute the script. Here is the complete script which I have used in the SQL in Sixty Seconds Video. SELECT [schema_name] = s.name, table_name = o.name, MAX(i1.type_desc) ClusteredIndexorHeap, COUNT(i.TYPE) NoOfNonClusteredIndex, p.rows FROM sys.indexes i INNER JOIN sys.objects o ON i.[object_id] = o.[object_id] INNER JOIN sys.schemas s ON o.[schema_id] = s.[schema_id] LEFT JOIN sys.partitions p ON p.OBJECT_ID = o.OBJECT_ID AND p.index_id IN (0,1) LEFT JOIN sys.indexes i1 ON i.OBJECT_ID = i1.OBJECT_ID AND i1.TYPE IN (0,1) WHERE o.TYPE IN ('U') AND i.TYPE = 2 GROUP BY s.name, o.name, p.rows ORDER BY schema_name, table_name Related Tips in SQL in Sixty Seconds: Find Row Count in Table – Find Largest Table in Database Find Row Count in Table – Find Largest Table in Database – T-SQL Identify Numbers of Non Clustered Index on Tables for Entire Database Index Levels, Page Count, Record Count and DMV – sys.dm_db_index_physical_stats Index Levels and Delete Operations – Page Level Observation What would you like to see in the next SQL in Sixty Seconds video? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video Tagged: Excel

    Read the article

  • SQL SERVER – Get Date and Time From Current DateTime – SQL in Sixty Seconds #025 – Video

    - by pinaldave
    This is the 25th video of the SQL in Sixty Seconds series we started a few months ago. Even though this is the 25th video, it seems like we started just a few days ago. The best part of SQL in Sixty Seconds is that one can learn something new in less than sixty seconds. Many of these concepts are not new to everyone, but we all have 60 seconds to refresh our memories. In this video I have touched on a very simple question which I receive very frequently on this blog. Q1) How to get the current date time? Q2) How to get only the date from a datetime? Q3) How to get only the time from a datetime? I have created a sixty second video on this subject and hopefully this will help many beginners in the SQL Server field. Here is a similar script to the one I have used in the video. SELECT GETDATE() GO -- SQL Server 2000/2005 SELECT CONVERT(VARCHAR(8),GETDATE(),108) AS HourMinuteSecond, CONVERT(VARCHAR(10),GETDATE(),101) AS DateOnly; GO -- SQL Server 2008 Onwards SELECT CONVERT(TIME,GETDATE()) AS HourMinuteSeconds; SELECT CONVERT(DATE,GETDATE()) AS DateOnly; GO Related Tips in SQL in Sixty Seconds: Retrieve Current Date Time in SQL Server CURRENT_TIMESTAMP, GETDATE(), {fn NOW()} Get Time in Hour:Minute Format from a Datetime – Get Date Part Only from Datetime Get Current System Date Time Get Date Time in Any Format – UDF – User Defined Functions Date and Time Functions – EOMONTH() – A Quick Introduction DATE and TIME in SQL Server 2008 I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Image Credit: Movie Gone in 60 Seconds Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • SQL SERVER – Three Methods to Insert Multiple Rows into Single Table – SQL in Sixty Seconds #024 – Video

    - by pinaldave
    One of the questions I receive most often from developers is whether there is any way to insert multiple rows into a single table in a single statement. Currently, when developers have to insert values into a table they write multiple insert statements. First of all, this is not only boring, it is also very time consuming. Additionally, one has to repeat the same syntax so many times that the word boring becomes an understatement. In the following quick video we have demonstrated three different methods to insert multiple values into a single table. -- Insert Multiple Values into SQL Server CREATE TABLE #SQLAuthority (ID INT, Value VARCHAR(100)); Method 1: Traditional Method of INSERT…VALUE -- Method 1 - Traditional Insert INSERT INTO #SQLAuthority (ID, Value) VALUES (1, 'First'); INSERT INTO #SQLAuthority (ID, Value) VALUES (2, 'Second'); INSERT INTO #SQLAuthority (ID, Value) VALUES (3, 'Third'); Clean up -- Clean up TRUNCATE TABLE #SQLAuthority; Method 2: INSERT…SELECT -- Method 2 - Select Union Insert INSERT INTO #SQLAuthority (ID, Value) SELECT 1, 'First' UNION ALL SELECT 2, 'Second' UNION ALL SELECT 3, 'Third'; Clean up -- Clean up TRUNCATE TABLE #SQLAuthority; Method 3: SQL Server 2008+ Row Construction -- Method 3 - SQL Server 2008+ Row Construction INSERT INTO #SQLAuthority (ID, Value) VALUES (1, 'First'), (2, 'Second'), (3, 'Third'); Clean up -- Clean up DROP TABLE #SQLAuthority; Related Tips in SQL in Sixty Seconds: SQL SERVER – Insert Multiple Records Using One Insert Statement – Use of UNION ALL SQL SERVER – 2008 – Insert Multiple Records Using One Insert Statement – Use of Row Constructor I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • SQL SERVER – Cardinality Estimation and Performance – SQL in Sixty Seconds #072

    - by Pinal Dave
    Yesterday I wrote a blog post based on my latest Pluralsight course on learning SQL Server 2014. I discussed the newly introduced cardinality estimation in SQL Server 2014 and how it improves the performance of queries. The cardinality estimation logic is responsible for the quality of query plans and has a major impact on the performance of any query. This logic had not been updated for quite a while, but in the latest version, SQL Server 2014, it has been re-designed. The new logic now incorporates various assumptions and algorithms for OLTP and warehousing workloads. I hope my earlier blog post clearly explained how the new cardinality estimation logic improves performance. If not, I suggest you watch the following quick video where I explain this concept in extremely simple words. You can download the code used in this course from Simple Demo of New Cardinality Estimation Features of SQL Server 2014. Action Item Here are the blog posts I have previously written. You can read them over here: Simple Demo of New Cardinality Estimation Features of SQL Server 2014 Pluralsight Course You can subscribe to my YouTube Channel for frequent updates. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Video
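
    The demo code itself lives at the link above; as a rough sketch of how the two estimators can be compared (the AdventureWorks2014 database and Sales.SalesOrderDetail table are placeholder examples, not taken from the course), the new estimator follows the database compatibility level, and individual queries can opt in or out with the documented trace flags:

    -- The new cardinality estimator is used when the database compatibility level
    -- is 120 (SQL Server 2014) or higher.
    ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 120;
    GO
    -- Force the legacy (pre-2014) estimator for a single query.
    SELECT COUNT(*) FROM Sales.SalesOrderDetail
    OPTION (QUERYTRACEON 9481);
    GO
    -- Or request the new estimator while the database is still at an older level.
    SELECT COUNT(*) FROM Sales.SalesOrderDetail
    OPTION (QUERYTRACEON 2312);
    GO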

    Read the article

  • IT Optimization Plan Pays Off For UK Retailer

    - by Brian Dayton
    I caught this article in ComputerworldUK yesterday. The headline talks about how UK-based supermarket chain Morrisons is increasing their IT spend...OK, sounds good. Even nicer that Oracle is a big part of that. But what caught my eye were three things: 1) Morrison's truly has a long term strategy for IT. In this case, modernizing and optimizing how they use IT for business advantage. 2) Even in a tough economic climate, Morrison's views IT investments as contributing to and improving the bottom line. Specifically, "The investment in IT contributed to a 21 percent increase in Morrison's underlying profit.." 3) The phased, 3-year "Optimization Plan" took a holistic approach to their business--from CRM and Supply Chain systems to the underlying application infrastructure. On the infrastructure front, adopting a more flexible Service-Oriented Architecture enabled them to be more agile and adapt their business, and Identity Management helped with sometimes mundane (but costly) issues like lost passwords and being able to document who has access to what. Things don't always turn out so rosy. And I know it was a long and difficult process...but it's nice to see a happy ending every once in a while.

    Read the article

  • SQL SERVER – Changing Default Installation Path for SQL Server

    - by pinaldave
    Earlier I wrote a blog post about SQL SERVER – Move Database Files MDF and LDF to Another Location, and in that post we discussed how we can change the location of the MDF and LDF files after the database is already created. I had mentioned that we would discuss how to change the default location of the database. This way we do not have to move the database to a different location after it is created. The ideal scenario would be to specify this default location of the database files when the SQL Server installation is performed. If you have already installed SQL Server, there is an easy way to solve this problem. This will not impact any database created before the change; it will only affect the default location of databases created after the change. To change the default location for the SQL Server installation, follow the steps mentioned below: Right click on the server >> Click on Properties >> Go to the Database Settings screen. You can change the default location of the database files there. All future databases created after the setting is changed will go to this new location. You can also do the same with T-SQL and here is the T-SQL code to do the same. USE [master] GO EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', REG_SZ, N'F:\DATA' GO EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', REG_SZ, N'F:\DATA' GO What are the best practices you follow with regards to the default file location for your databases? I am interested to know them. Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology
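
    On SQL Server 2012 and later, the effective defaults can also be read back with SERVERPROPERTY, which is a quick way to confirm the change took effect (a small sketch complementing the script above):

    -- Confirm the current default data and log locations (SQL Server 2012+).
    SELECT
        SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
        SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;
    GO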

    Read the article

  • Manage ClickOnce releases for different parties

    - by Dirk Beckmann
    I'm struggling with release management of a piece of software. First some general information: It is a ClickOnce application I follow the release often practice There are about 30 parties served with this software I need full control over which update will be delivered to which party Not each party is allowed to get the latest update/release Each party has multiple clients that are all allowed to get the latest update served for that specific party So that's a rough description of my requirements. Let me explain how I was thinking about solving this. I would like to create a "deployment" website (asp.net) that will handle all the requests There are two endpoints, one to download the client and one where the client checks for updates So each party has a separate endpoint like DeploymentSite/party1 and another for DeploymentSite/party2 The Application Files should still be stored centrally So I thought it would be manageable with mage.exe with the following steps Build the application and store the new release into the Application Files repository/folder Get the parties that should be updated (config file, database, whatever) Run mage.exe to create a new application and deployment manifest for each party in the update list with the new Application File location (1.0.2) Actually I'm really struggling with this mage.exe stuff. I can't create the appropriate files with the needed codebase. How do I handle these requirements?

    Read the article

  • How do managers know if a person is a good or a bad programmer?

    - by Pavel Shved
    In most companies that do programming, teams and divisions consist of programmers who design and write code and managers who... well, do the management stuff. Aside from just not writing code, managers usually do not even look at the code the team develops, and may not even have a proper IDE installed on their work machines. Still, the managers are to judge if a person works well, if he or she should be put in charge of something, or if a particular developer should be assigned to a task of the most importance and responsibility. And last, but not least: the managers usually assign the quarterly bonuses! To do the above effectively, a manager should know if a person is a good programmer—among other traits, of course. The question is, how do they do it? They don't even look at the code people write, they can't directly assess the quality of the components programmers develop... but their estimates of who is a good coder, and who is "not as good", are nevertheless correct in most cases! What is the secret?

    Read the article

  • Oracle Number One in Supply Chain Planning

    - by Stephen Slade
    Something nice to write home about! I saw this accomplishment and thought it was worth promoting, with special congrats to the VCP team. Read on: Summary: Oracle is the #1 player in Supply Chain Planning according to research firm ARC Advisory Group. Details: The report (Source: ARC Advisory Group, “Supply Chain Planning Worldwide Outlook, Market Analysis and Forecast through 2016,” Clint Reiser, Steve Banker) gives Oracle 21.1% of revenue share, compared to SAP, which was second at 18.6%. JDA Software, Aspen, Logility, and Infor were the next players in the market. The total market was valued at $1.506B. ARC counts Software (new license and upgrades), Implementation Services, Maintenance and Support, and SaaS in its definition. ARC defines supply chain planning to include four key application areas: Extended SCP, Manufacturing Planning, Inventory/Distribution Planning, and Demand Management. Extended SCP consists of Network Design, Capable to Promise, SCP Composites, and Extended Supply Chain BI software. In the report, ARC further gives Oracle the number one spot in both the Software Revenues and Services Revenues subsegments, as well as in many vertical areas such as Government, Electronics and Electrical, Medical Products, Pharmaceutical, and Wholesale/Distribution. ARC also issued a forecast that predicts SCP revenue will grow from $1.506B in 2011 to $2.172B in 2016, a CAGR of 7.6%. The report has several positive quotes about Oracle, including calling Oracle a “visionary,” and states that “Oracle has leveraged a broad set of home-grown and acquired offerings to create a comprehensive, integrated, yet modular suite with applicability to a wide range of industries.” Blog Link: http://blog.us.oracle.com/marketdata/?97119896 (shawn.willett@oracle.com)

    Read the article

  • What You Said: How You Track Your Time

    - by Jason Fitzpatrick
    Earlier this week we asked you to share your favorite time tracking tips, tricks, and tools. Now we’re back to highlight the techniques HTG readers use to keep tabs on their time. While more than one of you expressed confusion over the idea of tracking how you spend all your time, many of you were more than happy to share the reasons for and the methods you use to stay on top of your time expenditures. Scott uses a fluid and flexible project management tool: I use kanbanflow.com, with two boards to manage task prioritisation and backlog. One board called ‘Current Work’ has three columns ‘Do Today’, ‘In Progress’ and ‘Done’. The other is called ‘Backlog’, which splits tasks into priority groups – ‘Distractions (NU+NI)’, ‘Goals (NU+I)’, ‘Interruptions (U+NI)’ and ‘Critical (U+I)’, where U is Urgent and I is Important (and N is Not). At the end of each day, I move things from my Backlog to my ‘Current Work’ board, with the idea being to complete Goals before they become Critical. That way I can focus on ‘Current Work’ Do Today so I don’t feel overwhelmed and can plan my day. As priorities change or interruptions pop up, it’s just a matter of moving tasks between boards. I have both tabs open in my browser all day – this is probably good for knowledge workers strapped to their desk, not so good for those in meetings all day. In that case, go with the calendar on your phone. While the above description might make it sound really technical, we took the cloud-based app for a spin and found the interface to be very flexible and easy to use.

    Read the article

  • OpenWorld Approaching... A few opportunities to share your needs with Oracle

    - by RichMill
    We will be at OpenWorld from Monday the 1st to Wednesday the 3rd. The My Oracle Support and Enterprise Manager user research team will be in action. If you are someone who does patching, edits configurations, or uses either MOS configuration management (the collector) OR Enterprise Manager configuration compare or search, we have a treat for you! Come give us your feedback on how you do your tasks, what needs you have, and how we can do better in this space. We will be doing this during OOW, but an OOW badge is not required to participate. OR If you are someone who downloads large amounts of software (say, the entire EBS stack) and wants to understand how one can customize a "recommended" stack of software for yourself, or your customers, let us know! We have a study looking at how to create, customize and download all of the software needed for an installation. This will be done after OOW via webconference, so customers from anywhere in the world can participate. We want to hear from you, so we can get this right! E-mail us directly at [email protected] - or leave a comment with your email, so we can get your feedback into one or both of these two discussions. Hope you can participate!

    Read the article

  • How do developers verify that software requirement changes in one system do not violate a requirement of downstream software systems?

    - by Peter Smith
    In my work, I do requirements gathering, analysis and design of business solutions in addition to coding. There are multiple software systems and packages, and developers are expected to work on any of them, instead of being assigned to make changes to only one system or just a few systems. How do developers ensure they have captured all of the necessary requirements and resolved any conflicting requirements? An example of this type of scenario: Bob the developer is asked to modify the problem ticket system for a hypothetical utility repair business. They contract with a local utility company to provide this service. The old system provides a mechanism for an external customer to create a ticket indicating a problem with utility service at a particular address. There is a scheduling system and an invoicing system that are dependent on this data. Bob's new project is to modify the ticket placement system to allow multiple addresses to be entered by a landlord or other end customer with multiple properties. The invoicing system bills per ticket, but should be modified to bill per address. What practices would help Bob discover that the invoicing system needs to be changed as well? How might Bob discover what other systems in his company might need to be changed in order to support the new changes/business model? Let's say there is a documented specification for each system involved, but there are many systems and Bob is not familiar with all of them. End of example. We're often in this scenario, and we do have design reviews, but management places ultimate responsibility for any defects (business process or software process) on the developer who is doing the design and the work. Some organizations seem to be better at this than others. How do they manage to detect and solve conflicting or incomplete requirements across software systems? We currently have a lot of tribal knowledge and just a few developers who understand the entire business and software chain. This seems highly ineffective and leads to problems at the requirements level.

    Read the article

  • ScreenManagement better practices ?! Textbox not focusing

    - by xykudyax
    I saw a question here about using DataTemplates with WPF for screen management. I was curious and gave it a try; I think the idea is amazing and very clean. Though I'm new to WPF, I have read many times that almost everything should be done in XAML and very little should be "coded behind". My question revolves around the DataTemplate idea: WHERE should the code that calls the transitions be? Where should I define which commands are available in which screens? For example: [ScreenA] Commands: Pressing B - Goes to state B Pressing ESC - Exits [ScreenB] Commands: Pressing A - Goes to state A Pressing SPACE - Exits Where do I define the keyEventHandlers? And where do I call the next screen? I'm doing this as a hobby for learning and "if you are learning, better learn it right" :) Thank you for your time. Yes, the Q/A I was talking about is: What's a good way to handle game screen management in WPF? What I've done so far was to create a Screen class (derived from UserControl) and create some virtual methods: - one for initializing stuff (like focusing a given component by default) - another for input handling; I handle it by using a switch case and by listening to the PreviewKeyDown event from the parent container (MainWindow). I'm not able to do it another way! Help?! - and finally one that removes the keyEvent method (when the screen is terminated): Parent.PreviewKeyDown -= OnKeyDown; Am I doing okay? I face a problem: when I add a new screen (UserControl) containing a TextBox, I'm not able to give it autofocus :/ The caret is there but it is not blinking, and I have to hit "TAB" before being able to input anything at all :/

    Read the article

  • How to build Gantt chart from a set of Redmine tickets without filling dates in all of them?

    - by Alexander Gladysh
    Redmine 1.1.1. I've created a set of tickets for a new project. In each issue I filled in the Subject, Description and Estimated time fields. I also filled in the blocks/blocked by dependencies in Related issues. But the Gantt chart for this project is empty (that is, it contains all the tasks, but does not contain any "bars" for them). I need to get a Gantt chart (or any other visual representation) to show to other project members. I'd hate to type all that information again into OpenProj. Is there a way to get a serviceable Gantt chart out of Redmine? Update: In the answers below I read that to get a working Gantt chart I have to input the start date and due date manually for each issue. I believe that this information should be inferred automatically from the start date of the first ticket (first, dependency-wise), the estimated time of each ticket, the dependency graph, resource assignment and the working hours calendar. Just as it happens in any minimally sane Gantt chart project management tool. To enter this information by hand and to keep it up-to-date manually as the project evolves is an insane waste of time. Is there a way to generate a Gantt chart from the set of Redmine tickets without filling in all this information manually? (Solutions involving data export + import into a sane tool or involving existing plugins are perfectly acceptable.)

    Read the article

  • CEO Taken Captive in His Own Factory?

    - by Stephen Slade
    Last Friday was no ordinary day for Chip Starnes, the 42 year old factory owner of Specialty Medical Supplies in China. He recently announced the movement of some of the production of their diabetes testing equipment from Beijing to Mumbai, India. Of the 110 employees at the facility, about 80 protested by blocking the doors and refusing to let Chip Starnes out of the facility. He has been trapped in his office for several days now. The employees thought the factory was closing, but Mr. Starnes said it was not. Misinformation? Poor communication? Work stoppage. This is a good example of supply chain disruption. Parked cars are blocking the entrance to the facility, the front gates are chained closed, and the CEO is a prisoner in his own factory. Chip Starnes was presented with documents to sign in Chinese indicating he would pay severance and other demands he did not understand, possibly bankrupting the company. If you depend on supply from China and other foreign suppliers, how reliable are your sources? For example, how are the shop-floor employee relations? Is it possible to predict these types of HR risks and plan around them? What are your contingencies? It's important to ask the right questions and hear good answers. Having tools in place to rapidly evaluate, assess and react to these disruptions is the key to survival. Hear how leading organizations are reinforcing their supply chains and mitigating risk through technology with Oracle's latest release of Oracle Supply Chain Management. Source: WSJ pg.B1, June 25, 2013

    Read the article

  • using a wiki for requirements

    - by apollodude217
    Hi, I'm looking into ways of improving requirements management. Currently, we have a Word document published on a web site. Unfortunately, we cannot (to my knowledge) look at changes from one revision to the next. I would greatly prefer to be able to do so, much like with a wiki or VCS (or both, like the wikis on Bitbucket!). Also, each document describes changes devs are expected to meet by a given deadline. There is no collection of accumulated app features documented anywhere, so it's sometimes hard to distinguish between a bug and a (poorly-designed) feature when trying to make quick fixes to legacy apps. So I had an idea I wanted to get feedback on. What about: Using a wiki so that we can track who changed what and when (mostly to even see if any edits were made since the last time one looked). Having one, say, wiki page per product rather than one per deadline, keeping up with all features of the product rather than the changes that should be implemented. This way, I can look at a particular revision of the page to see what the app should do at a given point in time, and I can look at changes to the page since the last release for the requirements to be implemented by the next deadline. Waddayathink?

    Read the article
