Search Results

Search found 22961 results on 919 pages for 'memory management'.

  • What CI server and Configuration Management tools should I use

    - by Bera
    Hi! What CI server and Configuration Management tools should I use together for truly maintainable development and deployment? There isn't a de facto sustainable Rails environment, is there? Some assumptions:
    • version control: OK - git (the de facto tool)
    • test framework: OK - whatever (RSpec is my choice)
    • code coverage and analysis: OK - whatever (metric-fu, for example)
    • server stack: OK (Passenger, for example)
    • issue tracker: RedMine
    • etc. ...
    I want to try the Integrity and Moonshine projects; they seem like a good place to begin, don't they? What do you think about this? Thanks, Bruno

  • 3rd Party Document Management Service

    - by Element
    I am developing an ASP.NET application that requires users to upload and view various documents. Rather than reinvent the wheel, I was thinking about using a 3rd-party service like Scribd to handle these documents and integrating it into my app via their API; I really like their iPaper viewer, too. My concern is that some of these documents will contain sensitive data. Even though Scribd's FAQ says they are equipped to handle sensitive information, I am a little hesitant to trust an unpaid service that lacks an SLA. Has anyone used Scribd successfully for a similar task? Or can anyone recommend a better document management service?

  • SQL Server 2008 Management Studio doesn't recognize new Schema

    - by Lieven Cardoen
    I have created a new schema in a database, called Contexts. Now when I want to write a query, Management Studio doesn't recognize the tables that belong to the new schema. It says: 'Invalid object name Contexts.ContextLibraries'... Transact-SQL: INSERT INTO [Contexts].[ContextLibraries] (ChannelId, [IsSystem]) VALUES (@ChannelId, 1) When I try the same thing on my local database, it does work... Any ideas? I did try changing the user's default schema from dbo to Contexts, but that doesn't work. I also checked Contexts in 'Schemas owned by this user', without success. Update: Apparently the SQL query does work, but the editor raises an error saying the object is invalid.
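
    The "Update" is the key clue: the statement executes, so this looks like SSMS's IntelliSense metadata cache being stale rather than a real object error; refreshing it via Edit > IntelliSense > Refresh Local Cache (Ctrl+Shift+R) usually clears the warning. A minimal sketch of the scenario (the column types are assumptions for illustration):

    CREATE SCHEMA Contexts AUTHORIZATION dbo;
    GO
    CREATE TABLE Contexts.ContextLibraries
    (
        ChannelId int NOT NULL,  -- assumed type
        IsSystem  bit NOT NULL   -- assumed type
    );
    GO
    -- This insert succeeds even while the editor still flags the name;
    -- the editor validates against its cached metadata, not the live catalog.
    INSERT INTO [Contexts].[ContextLibraries] (ChannelId, [IsSystem])
    VALUES (1, 1);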

  • Event-based project management

    - by andreas
    Hey fellas! Where I work we have a number of contract programmers. We handle the requirements and the project management of our products. I have been trying various project timing and estimation techniques but can't get the hang of them yet. I have read Joel's evidence-based scheduling, and I was wondering if there's anything out there that can help me apply that theory? I'm not looking for complex software, but perhaps something in Excel? Any help will be appreciated. Andreas

  • Looking for Light Time Management Software Suggestions (for Mac)

    - by tmo256
    I'm looking for a simple project management app that performs task scheduling, along the lines of Merlin or MS Project, but nowhere near as robust. I don't need to deal with other (human) resources, but I work on anything from 3 to 6 different projects at a time. What I'd like is to be able to input deadlines and tasks and have a schedule suggested to complete them. I do technical work, but I don't think I need anything specific to software development, especially considering I do plenty of other kinds of things, like graphic design and social media PR. I'd really like this to be dead simple, as simple as possible. Suggestions? OmniPlan, something web-based? I definitely cannot afford anything too extravagant; I'm really looking for something under $200. Thanks for your input!

  • Group SQL tables in SQL Server Management Studio object explorer

    - by MainMa
    I have a database which has approximately sixty tables, and other tables are added constantly. Each table is part of a schema. Such a quantity of tables makes Microsoft SQL Server Management Studio 2008 difficult to use. For example, I must scroll up in Object Explorer to access database-related functions, or scroll down each time I need to access Views or Security features. Is it possible to group several tables so they can be expanded or collapsed in Object Explorer? Maybe a folder could be displayed for each schema, letting me collapse the folders I don't need to use?
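
    Object Explorer has no per-schema folders, but two workarounds come close: the node filter (right-click the Tables node > Filter) can narrow the tree to one schema at a time, and a catalog query gives a schema-grouped listing. A sketch of the latter:

    -- List tables grouped by schema, mirroring the folders you wish existed.
    SELECT s.name AS schema_name,
           t.name AS table_name
    FROM sys.tables AS t
    JOIN sys.schemas AS s
        ON t.schema_id = s.schema_id
    ORDER BY s.name, t.name;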

  • Project tracking/management tool

    - by Alvaro Rodriguez
    Which project tracking tool do you use? Does it allow programmers to bill hours worked to projects/tasks? Does it allow you to track items promised vs. items delivered? Does it allow you to forecast personnel needs when you have only a ballpark estimate of how many hours you are going to need for different task types? Does it integrate with your bug tracker? [Mantis] Does it integrate with your source control tool? [Subversion] Does it allow you to easily publish your schedule and current priorities to team members? Does it produce reports on any or all of the above? Am I even right in calling a tool that does those things a "project tracking/management tool"? What does your tool have that makes you love it and use it every day? I don't really care about Gantt charts. I find Microsoft Project quite clunky for these needs (although I'm hardly an expert user).

  • Problem exporting SQL Server Management Studio Express tables to GoDaddy

    - by brohjoe
    I'm having a terrible time exporting SQL Server Management Studio Express tables to the GoDaddy web server. GoDaddy support can't help either. I started by using the Microsoft Database Publishing Wizard for SQL Server, thinking it would be 'easy'... not! I ran into user/password errors even though I was using the user and password that were created for the SQL database on the GoDaddy site. I called GoDaddy's help desk support and went through several iterations of processes to get the thing working, but it didn't. Finally, the support guy acted like his phone went on the blink and scuttled away. There has got to be some way to upload SQL Server to a web server without a lot of drama. Any suggestions?

  • Management Studio default file save location

    - by jayrdub
    Open a new query window. Write some SQL. Save the script; the Save File As dialog box opens - but always to the same default location in the Profiles directory. Is there any way to set my default file location? ...Like I used to do with apps from the 1980s? Under Tools | Options a default location can be specified for query results. I need the same thing for new queries (the text editor). I tried changing locations in the Registry, but SSMS just overwrote my changes. Any suggestions? (I saw this unanswered question at http://www.eggheadcafe.com/software/aspnet/30098335/management-studio-default.aspx and I had the same exact question, so I just reposted it here.)

  • SQL Server 2012 Management Studio is not showing me Database Mail

    - by Sreejith
    I am trying to send mail through SQL Server, but when I expand SQL Server Logs I cannot find Database Mail. How can I fix this? Please help me, guys. I tried many ways to make Database Mail show up there, but couldn't get anywhere. I also tried the following script:

    -- To fix this, run the following script:
    USE Master
    GO
    sp_configure 'show advanced options', 1
    GO
    reconfigure with override
    GO
    sp_configure 'Database Mail XPs', 1
    GO
    reconfigure
    GO
    sp_configure 'show advanced options', 0
    GO

    Please find below a link to a snapshot of my SQL Server 2012 Management Studio; I don't have Database Mail there: https://imageshack.com/i/f0xY6qH2p
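
    Worth noting: in Object Explorer the Database Mail node normally lives under the Management folder, not under SQL Server Logs. Once the 'Database Mail XPs' option above is enabled and a mail profile has been configured, a hedged test sketch (the profile name and address are placeholders):

    -- Assumes a profile named 'TestProfile' was created via
    -- Management > Database Mail > Configure Database Mail.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'TestProfile',
        @recipients   = 'someone@example.com',
        @subject      = 'Database Mail test',
        @body         = 'If this arrives, Database Mail is working.';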

  • [PHP] - Lowering script memory usage during 'big' file creation

    - by Riccardo
    Hi there, people. It looks like I'm facing a typical memory exhaustion problem when using a PHP script. The script, originally developed by another person, serves as an XML sitemap creator, and on large websites it uses quite a lot of memory. I thought the problem was due to an algorithm holding data in memory until the job was done, but digging into the code I discovered that the script works this way:
    open output file (will contain the XML sitemap entries)
    in a loop: for each entry to be added to the sitemap, do an fwrite
    close file
    end
    Although there are no huge arrays or variables being kept in memory, this technique uses a lot of memory. I thought that maybe PHP was buffering the fwrites under the hood and "flushing" the data at the end of the script, so I modified the code to close and reopen the file every Nth record, but the memory usage is still the same... I'm debugging the script on my computer and watching memory usage: while the script runs, memory allocation grows. Is there a particular technique to instruct PHP to free unused memory, or to force flushing of buffers, if any? Thanks

  • ClearCase UCM Mainline Configuration Management Pattern Question

    - by cogmios
    A configuration management pattern question (using Rational ClearCase UCM). When I use the mainline approach, I create new releases like this:
    - create release 1 from the mainline
    - at a certain moment, baseline release 1 and deliver it to the mainline
    - create release 2 from the mainline
    - at a certain moment, baseline release 2 and deliver it to the mainline
    - create release 3 from the mainline
    - etc...
    This works very nicely because the pathname is /main/release 3/latest instead of /main/release 1/release 2/release 3/latest, etc. However... when release 1 contains new elements that have to be propagated to later releases, I cannot use the mainline, since the mainline is already on e.g. release 4. The only thing I can do is deliver/merge from release 1 directly to release 2. The bad thing is that the pathname for those files then becomes /main/release 1/release 2/latest (and possibly later releases). That is, I think, not in line with the mainline approach. What am I doing wrong?

  • SQL Server Management Studio 2008 - Insert string with more than two lines

    - by gsharp
    I want to paste a string with more than two lines into an nvarchar(max) cell (right-click a table in SQL Server Management Studio 2008 -- Edit Rows). Unfortunately, only the first line of the string is pasted into the cell. I know I could write an INSERT/UPDATE script for that, but it's not what I'm looking for. I need a quick way to paste some text into a cell. Is there a way to achieve my goal? Thanks in advance.
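
    For reference, the script route being avoided here is only a few lines; a sketch with placeholder table/column names, using NCHAR(13) + NCHAR(10) to embed the line breaks the grid editor drops:

    -- Table, column, and key values are placeholders for illustration.
    UPDATE dbo.SomeTable
    SET LongText = N'first line'  + NCHAR(13) + NCHAR(10)
                 + N'second line' + NCHAR(13) + NCHAR(10)
                 + N'third line'
    WHERE Id = 1;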

  • User access management in a J2EE web application

    - by kawtousse
    Hi everyone, I am working on a JSP/servlet project, and I have to complete the access management module for my JSPs, since I have more than one user with different profiles. I defined a table in my database which maps each profile to the URLs it is permitted to access, like this: id_profil: 1; url: http://localhost/...xyz.jsp; id_page: 1. Now I am trying to have the menu adapt to the id_profil of the logged-in user, so there are pages that are allowed for one profile but must be hidden from others. I have no idea so far how to realize this, and it is very important for me. Thanks for your help.
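
    One common shape for this, sketched in SQL with hypothetical names based on the table described: load the URLs permitted for the logged-in user's profile and render only those menu entries.

    -- Hypothetical table derived from the question's description.
    CREATE TABLE profil_page
    (
        id_profil int          NOT NULL,
        url       varchar(255) NOT NULL,
        id_page   int          NOT NULL
    );

    -- At login (or per request), fetch the pages this profile may see;
    -- the JSP menu then renders only the rows returned.
    SELECT url, id_page
    FROM profil_page
    WHERE id_profil = 1;  -- the logged-in user's profile id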

  • content management system

    - by Farax
    I am building a website for a client who needs a content management system to go with it. The client requires features such as content staging, an approval process before publishing a page, provision for templates (changing one changes the layout for the whole website), entry and expiry dates for pages, and content search. I am planning to use an existing open-source CMS for the work, but I am confused as to which one it should be. I need help deciding whether this approach is good, or whether I should develop my own CMS. And if I do use an open-source one, which one is extensible and customisable enough using .NET?

  • Project management software

    - by fahadsadah
    Hello, I am looking for a decent web application that performs project management, and am hoping you guys can help me out. Requirements:
    • Free, open-source software.
    • Runs on a Linux server (no ASP.NET).
    • Git integration. GitHub integration is a plus.
    • Tracks bugs and feature requests.
    • Version tracking/scheduling, i.e. being able to say that a feature will be implemented for version X.
    I was looking at Redmine, but I don't know about the last item. Is there a plugin for that, perhaps?

  • Session memory – who’s this guy named Max and what’s he doing with my memory?

    - by extended_events
    SQL Server MVP Jonathan Kehayias (blog) emailed me a question last week when he noticed that the total memory used by the buffers for an event session was larger than the value he specified for the MAX_MEMORY option in the CREATE EVENT SESSION DDL. The answer seems like an excellent subject for kicking off my new "401 - Internals" tag, which identifies posts where I pull back the curtains a bit and let you peek at what's going on inside the Extended Events engine.

    In a previous post (Option Trading: Getting the most out of the event session options) I explained that we use a set of buffers to store the event data before we write the event data to asynchronous targets. MAX_MEMORY, along with MEMORY_PARTITION_MODE, defines how big each buffer will be. Theoretically, that means I can predict the size of each buffer using the following formula:

    max memory / # of buffers = buffer size

    If it were that simple, I wouldn't be writing this post.

    I'll take "boundary" for 64K, Alex

    For a number of reasons that are beyond the scope of this blog, we create event buffers in 64K chunks. The result is that the buffer size indicated by the formula above is rounded up to the next 64K boundary, and that is the size used to create the buffers. If you think visually, this means that the graph of your max_memory option compared to the actual resulting buffer size will look like a set of stairs rather than a smooth line. You can see this behavior by looking at the output of dm_xe_sessions, specifically the fields related to the buffer sizes, over a range of different memory inputs.

    Note: This test was run on a 2-core machine using per_cpu partitioning, which results in 5 buffers. (See my previous post referenced above for the math behind the buffer count.)

    input_memory_kb  total_regular_buffers  regular_buffer_size  total_buffer_size
    637              5                      130867               654335
    638              5                      130867               654335
    639              5                      130867               654335
    640              5                      196403               982015
    641              5                      196403               982015
    642              5                      196403               982015

    This is just a segment of the results, showing one of the "jumps" between the buffer boundary at 639 KB and 640 KB. You can verify the size boundary by doing the math on the regular_buffer_size field, which is returned in bytes:

    196403 - 130867 = 65536 bytes
    65536 / 1024 = 64 KB

    The relationship between the input for max_memory and the point where the regular_buffer_size jumps from one 64K boundary to the next will change based on the number of buffers being created. The number of buffers depends on the partition mode you choose. If you choose any partition mode other than NONE, the number of buffers will depend on your hardware configuration. (Again, see the earlier post referenced above.) With the default partition mode of NONE, you always get three buffers, regardless of machine configuration, so I generated a "range table" for max_memory settings between 1 KB and 4096 KB as an example.
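
    (All of the buffer figures in this post come straight from the session DMV; a minimal query to watch them on your own server, using the same columns as the tables shown here:)

    SELECT name,
           total_regular_buffers,
           regular_buffer_size,
           total_buffer_size
    FROM sys.dm_xe_sessions;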
    start_memory_range_kb  end_memory_range_kb  total_regular_buffers  regular_buffer_size  total_buffer_size
    1      191    NULL  NULL     NULL
    192    383    3     130867   392601
    384    575    3     196403   589209
    576    767    3     261939   785817
    768    959    3     327475   982425
    960    1151   3     393011   1179033
    1152   1343   3     458547   1375641
    1344   1535   3     524083   1572249
    1536   1727   3     589619   1768857
    1728   1919   3     655155   1965465
    1920   2111   3     720691   2162073
    2112   2303   3     786227   2358681
    2304   2495   3     851763   2555289
    2496   2687   3     917299   2751897
    2688   2879   3     982835   2948505
    2880   3071   3     1048371  3145113
    3072   3263   3     1113907  3341721
    3264   3455   3     1179443  3538329
    3456   3647   3     1244979  3734937
    3648   3839   3     1310515  3931545
    3840   4031   3     1376051  4128153
    4032   4096   3     1441587  4324761

    As you can see, there are 21 "steps" within this range and max_memory values below 192 KB fall below the 64K per buffer limit so they generate an error when you attempt to specify them.

    Max approximates True as memory approaches 64K

    The upshot of this is that the max_memory option does not imply a contract for the maximum memory that will be used for the session buffers (Those of you who read Take it to the Max (and beyond) know that max_memory is really only referring to the event session buffer memory.) but is more of an estimate of total buffer size to the nearest higher multiple of 64K times the number of buffers you have. The maximum delta between your initial max_memory setting and the true total buffer size occurs right after you break through a 64K boundary, for example if you set max_memory = 576 KB (see the green line in the table), your actual buffer size will be closer to 767 KB in a non-partitioned event session. You get "stepped up" for every 191 KB block of initial max_memory which isn't likely to cause a problem for most machines.

    Things get more interesting when you consider a partitioned event session on a computer that has a large number of logical CPUs or NUMA nodes. Since each buffer gets "stepped up" when you break a boundary, the delta can get much larger because it's multiplied by the number of buffers. For example, a machine with 64 logical CPUs will have 160 buffers using per_cpu partitioning or if you have 8 NUMA nodes configured on that machine you would have 24 buffers when using per_node. If you've just broken through a 64K boundary and get "stepped up" to the next buffer size you'll end up with total buffer size approximately 10240 KB and 1536 KB respectively (64K * # of buffers) larger than the max_memory value you might think you're getting. Using per_cpu partitioning on a large machine has the most impact because of the large number of buffers created. If the amount of memory being used by your system within these ranges is important to you then this is something worth paying attention to and considering when you configure your event sessions.

    The DMV dm_xe_sessions is the tool to use to identify the exact buffer size for your sessions. In addition to the regular buffers (read: event session buffers) you'll also see the details for large buffers if you have configured MAX_EVENT_SIZE. The "buffer steps" for any given hardware configuration should be static within each partition mode, so if you want to have a handy reference available when you configure your event sessions you can use the following code to generate a range table, similar to the one above, that is applicable for your specific machine and chosen partition mode.
    DECLARE @buf_size_output table (
        input_memory_kb bigint,
        total_regular_buffers bigint,
        regular_buffer_size bigint,
        total_buffer_size bigint
    )
    DECLARE @buf_size int, @part_mode varchar(8)
    SET @buf_size = 1          -- Set to the begining of your max_memory range (KB)
    SET @part_mode = 'per_cpu' -- Set to the partition mode for the table you want to generate

    WHILE @buf_size <= 4096    -- Set to the end of your max_memory range (KB)
    BEGIN
        BEGIN TRY
            IF EXISTS (SELECT * from sys.server_event_sessions WHERE name = 'buffer_size_test')
                DROP EVENT SESSION buffer_size_test ON SERVER

            DECLARE @session nvarchar(max)
            SET @session = 'create event session buffer_size_test on server
                            add event sql_statement_completed
                            add target ring_buffer
                            with (max_memory = ' + CAST(@buf_size as nvarchar(4)) + ' KB, memory_partition_mode = ' + @part_mode + ')'
            EXEC sp_executesql @session

            SET @session = 'alter event session buffer_size_test on server
                            state = start'
            EXEC sp_executesql @session

            INSERT @buf_size_output (input_memory_kb, total_regular_buffers, regular_buffer_size, total_buffer_size)
                SELECT @buf_size, total_regular_buffers, regular_buffer_size, total_buffer_size
                FROM sys.dm_xe_sessions WHERE name = 'buffer_size_test'
        END TRY
        BEGIN CATCH
            INSERT @buf_size_output (input_memory_kb)
                SELECT @buf_size
        END CATCH

        SET @buf_size = @buf_size + 1
    END

    DROP EVENT SESSION buffer_size_test ON SERVER

    SELECT MIN(input_memory_kb) start_memory_range_kb,
           MAX(input_memory_kb) end_memory_range_kb,
           total_regular_buffers,
           regular_buffer_size,
           total_buffer_size
    FROM @buf_size_output
    GROUP BY total_regular_buffers, regular_buffer_size, total_buffer_size

    Thanks to Jonathan for an interesting question and a chance to explore some of the details of Extended Event internals. - Mike

  • Partner Blog Series: PwC Perspectives - "Is It Time for an Upgrade?"

    - by Tanu Sood
    Is your organization debating its next step with regard to Identity Management? While all the stakeholders are well aware that one-size-fits-all doesn't apply to identity management, it is just as true that no two identity management implementations are alike. Oracle's recent release of Identity Governance Suite 11g Release 2 has innovative features such as a customizable user interface, a shopping-cart-style request catalog, and more. However, only a close look at the use cases can help you determine if and when an upgrade to the latest R2 release makes sense for your organization. This post describes a few of the situations that PwC has helped our clients work through.

    "Should I be considering an upgrade?" If your organization has an existing identity management implementation, the questions below are a good start to assessing your current solution to see if you need to begin planning for an upgrade:
    • Does the current solution scale and meet your projected identity management needs?
    • Does the current solution have a customer-friendly user interface?
    • Are you completely meeting your compliance objectives?
    • Are you still using spreadsheets?
    • Does the current solution have the features you need?
    • Is your total cost of ownership in line with well-performing similar-sized companies in your industry?
    • Can your organization support your existing identity solution?
    • Is your current product-based solution well positioned to support your organization's tactical and strategic direction?

    Existing Oracle IDM customers: Several existing Oracle clients are looking to move to R2 in 2013. If your organization is on Sun Identity Manager (SIM) or Oracle Identity Manager (OIM), and if your current assessment suggests that you need to upgrade, you should strongly consider OIM 11gR2. Oracle provides upgrade paths to Oracle Identity Manager 11gR2 from SIM 7.x / 8.x as well as Oracle Identity Manager 10g / 11gR1. The following are some of the considerations for migration:
    • Check the end-of-product-support schedule (for Sun or legacy OIM)
    • There are several new features available in R2 (including common Helpdesk scenarios, profiling of disconnected applications, increased scalability, custom connectors, browser-based UI configurations, portability of configurations during future upgrades, etc.)
    • Cost of ownership (for SIM customers)
    • Customizations that need to be maintained during the upgrade
    • Time/cost to migrate now vs. waiting for the next version
    If you are already on an older version of Oracle Identity Manager and actively maintaining your support contract with Oracle, you might be eligible for a free upgrade to OIM 11gR2. Check with your Oracle sales rep for more details.

    Existing IDM infrastructure in place: In the past year and a half, we have seen a surge in IDM upgrades from non-Oracle infrastructure to Oracle. If your organization is looking to improve the end-user experience related to identity management functions, the shopping-cart-style access request model and browser-based personalization features may come in handy. Additionally, organizations that have a large number of applications, including e-commerce, LDAP stores, databases, UNIX systems and mainframes, as well as a high frequency of user identity changes and access requests, will value the high scalability of the OIM reconciliation and provisioning engine. Furthermore, we have seen our clients like OIM's out-of-the-box (OOB) support for multiple authoritative sources.
    For organizations looking to integrate applications that do not have an exposed API, the Generic Technology Connector framework supported by OIM is helpful for quickly generating a custom connector using the OOB wizard. Similarly, organizations in need of not only flexible on-boarding of disconnected applications but also strict access management to these applications using approval flows will find the flexible disconnected-application profiling feature an extremely useful tool that provides a high degree of time savings. Organizations looking to develop custom connectors for home-grown or industry-specific applications will likewise find that the Identity Connector Framework support in OIM allows them to build and test a custom connector independently before integrating it with OIM. Lastly, most of our clients considering an upgrade to OIM 11gR2 have also expressed interest in the browser-based configuration feature that allows an administrator to quickly customize the user interface without adding any custom code. Better yet, code customizations, if any, made to the product are portable across future upgrades, which is viewed as a big time and money saver by most of our clients.

    Below are some upgrade methodologies we adopt based on client priorities and the scale of the implementation. For illustration purposes, we have assumed that the client is currently on Oracle Waveset (formerly Sun Identity Manager).

    Integrated deployment: The integrated deployment is typically used where a client wants to split the implementation so that their current IDM continues to handle the front-end workflows while OIM takes over the back-office operations incrementally. Once all the back-office operations have moved completely to OIM, the front-end workflows are migrated to OIM.

    Parallel deployment: This deployment is typically done where a distinct line can be drawn between which functionality each platform supports. For example, the current IDM implementation handles the password-reset functionality while OIM takes over the access provisioning and RBAC functions.

    Cutover deployment: A cutover deployment is typically recommended where a client has smaller, less complex implementations and it makes sense to leverage the migration tools to move them over immediately.

    What does this mean for YOU? There are many variables to consider when making upgrade decisions. For most customers, there is no 'easy' button. Organizations looking to upgrade or considering a new vendor should start by mapping their requirements to product features. The recommended approach is to take stock of both the short-term and long-term objectives; understand product features, future roadmap, maturity, and the level of commitment from R&D; and build the implementation plan accordingly. As we said in the beginning, there is no one-size-fits-all with Identity Management. So arm yourself with knowledge, engage in industry discussions, bring in business stakeholders, and start building your implementation roadmap. In the next post we will discuss best practices for R2 implementations, covering the do's and don'ts and sharing our thoughts on making implementations successful.

    Meet the writers:
    Dharma Padala is a Director in the Advisory Security practice within PwC. He has been implementing medium to large-scale Identity Management solutions across multiple industries, including the utility, health care, entertainment, retail and financial sectors. Dharma has 14 years of experience in delivering IT solutions, of which he has spent the past 8 years implementing Identity Management solutions.
    Scott MacDonald is a Director in the Advisory Security practice within PwC. He has consulted for several clients across multiple industries, including financial services, health care, automotive and retail. Scott has 10 years of experience in delivering Identity Management solutions.
    John Misczak is a member of the Advisory Security practice within PwC. He has experience implementing multiple Identity and Access Management solutions, specializing in Oracle Identity Manager and Business Process Execution Language (BPEL).
    Praveen Krishna is a Manager in the Advisory Security practice within PwC. Over the last decade Praveen has helped clients plan, architect and implement Oracle identity solutions across diverse industries. His experience includes delivering security across diverse topics like network, infrastructure, application and data, where he brings a holistic point of view to problem solving.
    Jenny (Xiao) Zhang is a member of the Advisory Security practice within PwC. She has consulted across multiple industries, including financial services, entertainment and retail. Jenny has three years of experience in delivering IT solutions, of which she has spent the past year and a half implementing Identity Management solutions.

  • Hekaton – SQL Server’s in-memory database engine

    - by Christian
    Microsoft have just gone public at the PASS Summit in Seattle about a new SQL Server engine that they're working on, optimized for high-memory servers: an in-memory OLTP database engine that is built into SQL Server rather than being a separate entity. This means that you can move just the performance-critical parts of your database to Hekaton. The new engine really pushes the performance boundaries by eliminating as many instructions as possible:
    • Main-memory-optimized tables which are decoupled from on-disk structures;
    • Everything is lock- and latch-free;
    • More work is pushed to compile time, so your T-SQL code is compiled natively into low-level code.
    We're already working with a customer on an early adoption program, so expect to hear from us on what we learn about implementing it! Christian Bolton - MCA, MCM, MVP, Technical Director, http://coeo.com - SQL Server Consulting & Managed Services
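
    As a rough sketch of what using it looks like (hedged: this is the DDL syntax that eventually shipped with the in-memory OLTP feature in SQL Server 2014; the syntax was not public at the time of this announcement, and the table/column names are placeholders):

    -- A memory-optimized table: decoupled from on-disk structures,
    -- accessed without locks or latches.
    CREATE TABLE dbo.SessionState
    (
        SessionId int NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        Payload   nvarchar(4000) NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);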

  • SQLAuthority News – Free Download – Microsoft SQL Server 2008 R2 RTM – Express with Management Tools

    - by pinaldave
    This blog post is in response to several inquiries about a free download of SQL Server 2008 R2 RTM. Microsoft has announced SQL Server 2008 R2 as RTM (Release To Manufacturing). Microsoft® SQL Server® 2008 R2 Express is a powerful and reliable data management system that delivers a rich set of features, data protection, and performance for embedded applications, lightweight websites and applications, and local data stores. Download Microsoft SQL Server 2008 R2 RTM - Express with Management Tools. Download Microsoft SQL Server 2008 R2 RTM - Management Studio Express. Download SQL Server 2008 R2 Books Online. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

  • Oracle Consulting North America is now live on PeopleSoft Services Procurement and PeopleSoft Resource Management

    - by Howard Shaw
    Last month, Oracle's own internal consulting group (OCS North America) went live on PeopleSoft Services Procurement and PeopleSoft Resource Management to manage all aspects of identifying, recruiting, and deploying billable subcontractors on North America Applications customer consulting projects. The primary goals were to enhance the subcontractor staffing process, improve operational and informational processes, and improve collaboration between the Oracle NA Consulting Subcontractor Program and subcontractor suppliers. Over 200 registered external suppliers access the tool, review open needs and competitively bid their resources to work on NA Applications projects. This implementation highlights the usage of Oracle's own solutions to streamline and enhance business operations, as the PeopleSoft 9.1 applications (Services Procurement and Resource Management) were deployed using Sun hardware, Oracle Enterprise Linux, and Oracle Virtual Machines. For more information, please navigate to the following web pages: PeopleSoft Services Procurement PeopleSoft Resource Management

  • Best in-memory cache of DB objects for Silverlight [closed]

    - by Jon
    Hi, I'd like to set up an in-memory cache of database objects (i.e., rows in a table) in Silverlight, which I'll do using WCF and LINQ to SQL. Once I have the objects in memory, I'm planning on using MSMQ to receive new objects whenever they have been modified. It's a somewhat complex approach, but the goal is to reduce trips to the database and allow instant data communication between Silverlight applications that are connected to the MSMQ. My Silverlight applications are meant to be long-running, and the amount of data to be cached will not be large. I'm planning on saving the in-memory cache using local storage. Anyway, in order to process the updated objects that come in, I'd like to know if the user has changed the existing object. Could I use some event relating to data binding to set a flag indicating that the object has changes? Maybe there's a better way to do the cache entirely? Thanks!

  • Oracle’s new release of Primavera P6 Enterprise Portfolio Management

    It is estimated that projects totaling more than $6 trillion in value have been managed with Primavera products. Companies turn to Oracle's Primavera project portfolio management solutions to help them make better portfolio management decisions, evaluate the risks and rewards associated with projects, and determine whether there are sufficient resources with the right skills to accomplish the work. Tune into this conversation with Yasser Mahmud, Director of Product Strategy, for the Oracle Primavera Global Business Unit, to learn how P6 revolutionized project management, the new features in the release of Oracle Primavera P6 version 7 and how this newest release helps project-intensive businesses manage their entire project portfolio lifecycle, including projects of all sizes.

  • Ubuntu One using 500 MB of memory even when idle

    - by cdysthe
    I'm a Dropbox convert (I hope!), but after having used Ubuntu One for a couple of weeks I notice a few differences from Dropbox. The most glaring difference is that the sync daemon constantly takes 500 MB of RAM on my system (Ubuntu 12.04 x64). It hogs this amount of memory as soon as I log in, does its initial sync/check, but keeps the memory. All in all, it seems to me that Ubuntu One uses more system resources than Dropbox. I am syncing the same folders and files with Ubuntu One as I was with Dropbox. Also, after I log in, Ubuntu One grinds at 100% CPU for at least five minutes, which can be annoying on a laptop but is not a showstopper. I'm wondering if this is a problem with my system, or if Ubuntu One is expected to use that amount of memory even when idle?
