Search Results

Search found 30268 results on 1211 pages for 'sql 2011'.

Page 36/1211 | < Previous Page | 32 33 34 35 36 37 38 39 40 41 42 43  | Next Page >

  • Using SQL Developer to Debug your Anonymous PL/SQL Blocks

    - by JeffS
    Everyone knows that SQL Developer has a PL/SQL debugger – check! Everyone also knows that it's only set up for debugging standalone PL/SQL objects like Functions, Procedures, and Packages, right? – NO! SQL Developer can also debug your Stored Java Procedures AND your standalone PL/SQL blocks. These bits of PL/SQL which do not live in the database are also known as 'Anonymous Blocks.' Anonymous PL/SQL blocks can be submitted to interactive tools such as SQL*Plus and Enterprise Manager, or embedded in an Oracle Precompiler or OCI program. At run time, the program sends these blocks to the Oracle database, where they are compiled and executed. Here's an example of something you might want help debugging:

        Declare
          x number := 0;
        Begin
          Dbms_Output.Put(Sysdate || ' ' || Systimestamp);
          For Stuff In 1..100 Loop
            Dbms_Output.Put_Line('Stuff is equal to ' || Stuff || '.');
            x := Stuff;
          End Loop;
        End;
        /

    With the power of remote debugging and unshared worksheets, we are going to be able to debug this ANON block! The trick – we need to create a dummy stored procedure and call it in our ANON block. Then we create an unshared worksheet and execute the script from there while the SQL Developer session is listening for remote debug connections. We step through the dummy procedure, and this takes us OUT to our calling ANON block. Then we can use watches, breakpoints, and all that fancy debugger stuff!

    First things first, create this dummy procedure:

        create or replace procedure do_nothing is
        begin
          null;
        end;

    Then mouse-right-click on your Connection and select 'Remote Debug.' For an in-depth post on how to use the remote debugger, check out Barry's excellent post on the subject. Open an unshared worksheet using Ctrl+Shift+N. This gives us a dedicated connection for our worksheet and any scripts or commands executed in it. Paste in the ANON block you want to debug, and add a call to the dummy procedure above as the first line of your BEGIN block, like so:

        Begin
          do_nothing();
          ...

    Then we need to set up the machine for remote debug for the session we have listening – basically we connect to SQL Developer. You can do that via an environment variable, or you can just add this line to your script:

        CALL DBMS_DEBUG_JDWP.CONNECT_TCP( 'localhost', '4000' );

    Where 'localhost' is the machine where SQL Developer is running and '4000' is the port you started the debug listener on. OK, with that all set, now just RUN the script. Once the PL/SQL call is made, the debugger will be invoked and you'll end up in the DO_NOTHING() object. Debugging an ANON block from SQL Developer is possible! If you step out of the dummy procedure, you'll end up in the script that's used to call it – which is the script you want to debug. The Anonymous Block is opened in a new SQL Developer page, and you can now step through the block, using watches and breakpoints as expected. I'm guessing your scripts are going to be a bit more complicated than mine, but this serves as a decent example to get you started. Here's a screenshot of a watch and breakpoint defined in the anon block being debugged: breakpoints, watches, and callstacks – oh my! For giggles, I created a breakpoint with a pass count of 90 for the FOR LOOP to see if it works. And of course it does.

    You Might Also Enjoy:
      - Using Pass Counts to Turbo Charge Your PL/SQL Breakpoints
      - SQL Developer Tip: Viewing REFCURSOR Output
      - The PL/SQL Debugger Strikes Back: Episode V
      - Debugging PL/SQL with SQL Developer: Episode IV
      - How to find dependent objects in your PL/SQL Programs using SQL Developer
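    Putting those pieces together, a minimal sketch of the debug-ready script might look like the following (assuming do_nothing has been compiled for debug and the debug listener is on localhost port 4000, as above):

        -- Connect this session back to the SQL Developer debug listener
        CALL DBMS_DEBUG_JDWP.CONNECT_TCP( 'localhost', '4000' );

        Declare
          x number := 0;
        Begin
          do_nothing();   -- dummy call that lets the debugger grab the session
          Dbms_Output.Put(Sysdate || ' ' || Systimestamp);
          For Stuff In 1..100 Loop
            Dbms_Output.Put_Line('Stuff is equal to ' || Stuff || '.');
            x := Stuff;
          End Loop;
        End;
        /

    Run this from the unshared worksheet while the remote debug session is listening, and execution will stop inside DO_NOTHING(), from which you can step out into the anonymous block.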

    Read the article

  • Tasks not appearing in Mac Outlook 2011

    - by Tama
    My current workplace uses Macs and my old workplaces used Windows. In my old workplaces I heavily used Outlook's Task functionality to manage my workload. I understand that the Task functionality in Outlook 2011 for Mac is heavily limited, so I was very pleased to find this useful "how-to" on making the most of Tasks. My problem is that my tasks don't appear in the Task folder, or anywhere else for that matter. Even if I search for the title of a task I've recently created, I still can't find it. After some Googling I found a forum thread suggesting it may be a problem with the Outlook database, which points to a Microsoft KB. So I went through all of the recommended steps on rebuilding/adding a new identity using the "Microsoft Database Utility" - the theory being that if I create a new identity I can test task creation using a "blank slate" identity. When I change the default identity to my newly created identity using the Microsoft Database Utility (which requires restarting the computer), task creation still doesn't work. Any ideas appreciated; I really miss the task functionality in Outlook 2010 for Windows.

    Read the article

  • Embedded Spotlight does not function in Outlook 2011

    - by syntaxcollector
    I have a rather strange problem. I manage a network of about 35 Macs, and we all recently switched from Mail.app to Outlook 2011 (please don't debate this; I've already had this conversation ad nauseam). We are using network home directories (NHDs) served from a Windows file server over the SMB protocol. The problem I'm having is that Spotlight does not function inside of Outlook - but ONLY inside of Outlook. The global Spotlight can find all email and contacts within Outlook, but the embedded Spotlight cannot. As a test, I took one of my users and switched them from a network home directory to a portable home directory (PHD) (meaning the home folder was copied to the local hard drive). This resulted in a working Spotlight within Outlook; as soon as I switched the user back to an NHD, however, it stopped working. I have already tried erasing the Spotlight index and killing the process to force re-indexing. I have exhausted all Spotlight troubleshooting, and since the global search is working that is obviously not the issue. I believe it has something to do with the Spotlight plugin Microsoft wrote that is located in /Library/Spotlight. Any ideas?

    Read the article

  • Active Directory Corrupted In Windows Small Business Server 2011 - Server No Longer Domain Controller

    - by ThinkerIV
    I have a rather bad problem with my Windows SBS 2011. First of all, I'll give the background to what caused the problem. I was setting up a new small business server network. I had my job about finished. The server was working great, all the workstations had joined the domain, and I had all my applications and data moved to the server. I thought I was done. But then it happened. I tried adding one more computer to the domain, and to my dismay the computer name was set to the same name as the server. Apparently when a computer joins a domain with the same name as another machine that is already on the domain, it overrides the first one. For normal workstations, this is not a big deal, you just delete the computer from AD and rejoin the original computer to the domain. However, for a server that is the domain controller it is a whole different story. Since the server got overridden in AD, it is no longer the domain controller. The DNS service is not working and all kinds of other services are failing also. So the question is, what are my options? I am embarrassed to admit it, but since this is a new server one thing I did not have setup yet was backup. So I have no backups to work from. I am worried that things are broken enough that I might need to do a reinstall. However, I already have several days worth of configuration into this server, so I would obviously prefer if there was a fix that would prevent me from needing to do a reinstall. All the server components are there and installed correctly, but they are misconfigured (I think it is basically just Active Directory). So I have the feeling that if I did the right thing I could solve the issue without a reinstall. Is there anyway to rerun the component that installs the initial configuration to "convert" the base windows server 2008 r2 install into a SBS? In other words in the program files folder there is an application called SBSsetup.exe, is there anyway to rerun this and have it reconfigure AD, etc. to work with SBS? Any insight will be greatly appreciated. Thanks.

    Read the article

  • Cumulative Update packages for SQL Server 2008 R2 RTM & SQL Server 2008 SP1

    - by ssqa.net
    Here is the news on the Cumulative Update releases for SQL Server 2008 R2 RTM & SQL Server 2008 Service Pack 1. First let us go through the SQL Server 2008 R2 RTM cumulative update: the release consists only of hotfixes that were released in Cumulative Updates 5, 6, & 7 for SQL Server 2008 SP1. Cumulative Update 1 for SQL 2008 R2 RTM is only intended as a post-RTM rollup of Cumulative Updates 5-7 for the release version of SQL Server 2008 SP1, for customers who plan to upgrade to SQL Server 2008 R2 and...(read more)

    Read the article

  • What will be important in Training in 2011?

    - by anders.northeved
    Now that we have started a new year I would like to give you a list of topics I think we will be discussing in training and learning in 2011.

    Some of the areas we have discussed earlier will still be just as important in 2011:
      - Time-to-knowledge: Still one of the most important issues for the training department.
      - Internal content production: Related to time-to-knowledge. How do we convert internal knowledge to a format that can be used for teaching others?
      - LMS integration: How do we get our existing LMS fully integrated with our other ERP modules like HCM, Order Management, Finance, Payroll etc.

    Some areas have been discussed before, but we'll focus more on these in 2011:
      - Combining internal and external training: A majority of training departments use a combination of external and internal training. Having the right mix is vital for the quality and efficiency of most training organizations.
      - Certification: More rules and regulations mean managing all employee certifications is more important than ever.

    Evolving trends in 2011:
      - Social Learning: We have been talking about this for a long time, but 2011 will be the year where we start using it for real (OK, I also said so last year – but this year I'm right…).
      - Real-life use of SCORM 2004: Again a topic we have talked about for a long time, but we are now actually starting to use it to give learners a better e-learning experience.
      - How do we engage and delight the learner? e-learning makes economical sense, it can be easy to understand, it is convenient – but how do we make it more engaging and delight our learners?
      - How to include more types of training in the LMS: One of the main focus areas of 2011 will be how to manage and measure mobile learning, on-the-job training and other forms of training in the LMS.
      - Mobile Learning: With the ever-growing use of smart phones, mobile learning will be THE hot topic of 2011 in the training world.

    New topics we will begin discussing in 2011:
      - What is beyond web 2.0 and social learning? Could it be content verification and personal accreditation?
      - Why gaming will not be the silver bullet for all types of e-learning: Many people believe gaming can be used for any kind of training, but the creation is too expensive and time consuming for most applications.

    Do you agree with these predictions? What are your own predictions? Let me see your comments! (photo: © Marti, photoxpress.com)

    Read the article

  • SQL Server Configuration timeouts - and a workaround [SSIS]

    - by jamiet
    Ever since I started writing SSIS packages back in 2004 I have opted to store configurations in .dtsConfig (i.e. XML) files rather than in a SQL Server table (aka SQL Server Configurations), however recently I inherited some packages that used SQL Server Configurations and thus had to immerse myself in their murky little world. To all the people that have ever gone onto the SSIS forum and asked questions about the ambiguous behaviour of SQL Server Configurations I now say this... I feel your pain!

    The biggest problem I have had was in dealing with the change to the order in which configurations get applied that came about in SSIS 2008. Those changes are detailed on MSDN at SSIS Package Configurations, however the pertinent bits are that, as the utility loads and runs the package, events occur in the following order:
      1. The dtexec utility loads the package.
      2. The utility applies the configurations that were specified in the package at design time and in the order that is specified in the package. (The one exception to this is the Parent Package Variables configurations. The utility applies these configurations only once and later in the process.)
      3. The utility then applies any options that you specified on the command line.
      4. The utility then reloads the configurations that were specified in the package at design time and in the order specified in the package. (Again, the exception to this rule is the Parent Package Variables configurations.) The utility uses any command-line options that were specified to reload the configurations. Therefore, different values might be reloaded from a different location.
      5. The utility applies the Parent Package Variable configurations.
      6. The utility runs the package.

    To understand how these steps differ from SSIS 2005 I recommend reading Doug Laudenschlager's blog post "Understand how SSIS package configurations are applied". The very nature of SQL Server Configurations means that the Connection String for the database holding the configuration values needs to be supplied from the command line. Typically then the call to execute your package resembles this:

        dtexec /FILE Package.dtsx /SET "\Package.Connections[SSISConfigurations].Properties[ConnectionString]";"\"Data Source=SomeServer;Initial Catalog=SomeDB;Integrated Security=SSPI;\""

    The problem then is that, as per the steps above, the package will (1) attempt to apply all configurations using the Connection String stored in the package for the "SSISConfigurations" Connection Manager, before then (2) applying the Connection String from the command line and then (3) applying the same configurations all over again. In the packages that I inherited, that first attempt to apply the configurations would time out (not unexpected); I had 8 SQL Server Configurations in the package and thus the package was waiting for 2 minutes until all the Configurations timed out (i.e. 15 seconds per Configuration) - in a package that only executes for ~8 seconds when it gets to do its actual work, a delay of 2 minutes was simply unacceptable.

    We had three options in how to deal with this:
      1. Get rid of the use of SQL Server Configurations and use .dtsConfig files instead
      2. Edit the packages when they get deployed
      3. Change the timeout on the "SSISConfigurations" Connection Manager

    #1 was my preferred choice but, for reasons I explain below*, wasn't an option in this particular instance. #2 was discounted out of hand because it negates the point of using Configurations in the first place. This left us with #3 - change the timeout on the Connection Manager. This is done by going into the properties of the Connection Manager, opening the "All" tab and changing the Connect Timeout property to some suitable value (in the screenshot below I chose 2 seconds). This change meant that the attempts to apply the SQL Server configurations timed out in 16 seconds rather than two minutes; clearly this isn't an optimum solution but it's certainly better than it was.

    So there you have it - if you are having problems with SQL Server configuration timeouts within SSIS, try changing the timeout of the Connection Manager. Better still - don't bother using SQL Server Configurations in the first place. Even better - install RC0 of SQL Server 2012 to start leveraging SSIS parameters and leave the nasty old world of configurations behind you. @Jamiet

    * Basically, we are leveraging an SSIS execution/logging framework in which the client had invested a lot of resources, and SQL Server Configurations are an integral part of that.

    Read the article

  • It's 2011-Do You Know Where Your Children Are?

    - by andyleonard
    Introduction: This is not a post about children. I was feeling plucky when I wrote this post at the end of last year. Sometimes when I feel plucky I'm inspired to create awesome blog post titles and ideas. Other times, this happens. 2011 Is Here! I was born in 1963. As a child I watched Neil Armstrong and Buzz Aldrin walk on the moon while Walter Cronkite narrated. At 11, I was fortunate enough to live next door to an engineer who taught me Motorola 6800 machine code and then BASIC. I have a long...(read more)

    Read the article

  • Non-printing characters in Word 2011 not showing even when enabled

    - by Henrik Söderlund
    I have a document I work on often, my resume. I have created a few different styles that I use and for some reason the non-printing characters have stopped showing properly. I have the option enabled (the reversed P) and the proper settings in the preferences checked. Here is a screenshot of the current view: basically, only the tab stops and the returns are showing. Upon doing an experiment by creating a new document, all characters (especially the spaces) show up nicely: I can copy this line and paste it into my resume document and it shows up there too. It seems my styles are doing something...

    Read the article

  • Excel 2011 for Mac VLOOKUP Date Issue

    - by Mitch
    I'm fairly proficient in using VLOOKUPs, but I'm having an issue looking up dates between two different spreadsheets:

        =VLOOKUP(B6,'[example.xlsx]Sheet1'!$B$1:$AA$260, 19, FALSE)

    My formula is retrieving a date fine, but the date is different when the cell is formatted for a date. Yet, when I change the formatting on each spreadsheet to display the date as a number, the number is the same (40115). The dates are displaying differently in each spreadsheet and I can't figure out why; they differ by 4 years and 1 day (10/30/13 vs. 10/29/09). One was previously .xls, but I saved both as .xlsx. Thanks.

    Read the article

  • How to automatically move mail into appropriate folders - Outlook for Mac 2011

    - by user53654
    Is there any way to have emails automatically placed into folders where the original email resides? Situation: I sort my emails into various folders based on clients. However, when a new email comes in that is a reply to a previous one, it goes to my inbox instead of going into the client folder. I know the typical answer is "just create a rule for that client". The problem with that is my clients change often, and creating a new rule that often is infeasible. I was hoping there might be some way to have a "follow source email" option. That way, when I move email A into folder X, all subsequent replies to email A will automatically be placed in folder X. Any ideas?

    Read the article

  • Multiple Selections from Dropdown in Word 2011 Form

    - by BR Hudgens
    Anyone know if there is a way to set up a Word form to allow the user to select multiple items from a dropdown list that are then displayed? For example, I need to set up a form to identify the applicability of specific procedures to various levels of our organization. Since the listing of all departments under all divisions can be pretty lengthy, I was hoping for the ability to check off a division and then have a dropdown list on the same line to select the applicable departments (so the selections are specific to that division and not all listed on the form if that division is not applicable). First, I'm not seeing how to create a dropdown that would allow multiple selections. Beyond that, I don't know if that form element alone would then display all selected items or if that would require another element of some type. Anyone have any suggestions on this? Thanks in advance!

    Read the article

  • Small Business Server 2011 and Remote access to documents

    - by Tim Long
    Assume I'm working away from the office; it's a hotel computer: Windows 7, Office 2010 and fast - so the best possible conditions. Using Companyweb - every time I open a document, I have to go through the logon process - seems odd to have to do that. Is this a 'by design' feature or is something wrong with my configuration? When I do open the documents, are they being stored somewhere locally, and should I be looking to delete them on this computer - or are they in a temporary file?

    Read the article

  • More information on the Patch Tuesday updates for SQL Server

    - by AaronBertrand
    Last week, Microsoft released a series of patches for all supported versions of SQL Server (from SQL Server 2005 SP3 all the way to SQL Server 2008 R2). The reason for the patch against SQL Server installations is largely a client-side issue with the XML viewer application, and for SQL Server specifically, the exploit is limited to potential information disclosure. A very easy way to avoid exposure to this exploit is simply to never open a file with the .disco extension (these files are likely already...(read more)

    Read the article

  • Good SQL error handling in Stored Procedures

    - by developerit
    When writing SQL procedures, it is really important to handle errors carefully. Having that in mind will probably save you effort, time and money. I have been working with MS-SQL 2000 and MS-SQL 2005 (I have not had the opportunity to work with MS-SQL 2008 yet) for many years now and I want to share with you how I handle errors in a T-SQL stored procedure. This code has been working for many years now without a hitch. N.B.: As another "best practice", I suggest using only ONE level of TRY … CATCH and only ONE level of TRANSACTION encapsulation, as doing otherwise may not be 100% safe.

        BEGIN TRANSACTION;
        BEGIN TRY
            -- Code in transaction goes here
            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            -- Rollback on error
            ROLLBACK TRANSACTION;
            -- Raise the error with the appropriate message and error severity
            DECLARE @ErrMsg nvarchar(4000), @ErrSeverity int;
            SELECT @ErrMsg = ERROR_MESSAGE(), @ErrSeverity = ERROR_SEVERITY();
            RAISERROR(@ErrMsg, @ErrSeverity, 1);
        END CATCH;

    In conclusion, I will just mention that I have been using this code with .NET 2.0 and .NET 3.5 and it works like a charm. The .NET TDS parser throws back a SqlException, which is ideal to work with.
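    For context, here is a minimal sketch of how the same pattern might sit inside a complete stored procedure; the procedure, table and parameter names are hypothetical and not from the original post:

        -- Hypothetical wrapper procedure using the TRY/CATCH pattern above
        CREATE PROCEDURE dbo.usp_TransferAmount   -- name is illustrative only
            @FromAccountId int,
            @ToAccountId   int,
            @Amount        money
        AS
        BEGIN
            SET NOCOUNT ON;

            BEGIN TRANSACTION;
            BEGIN TRY
                -- Both updates succeed or neither does
                UPDATE dbo.Accounts SET Balance = Balance - @Amount WHERE AccountId = @FromAccountId;
                UPDATE dbo.Accounts SET Balance = Balance + @Amount WHERE AccountId = @ToAccountId;

                COMMIT TRANSACTION;
            END TRY
            BEGIN CATCH
                -- Undo any partial work, then re-raise with the original message and severity
                ROLLBACK TRANSACTION;

                DECLARE @ErrMsg nvarchar(4000), @ErrSeverity int;
                SELECT @ErrMsg = ERROR_MESSAGE(), @ErrSeverity = ERROR_SEVERITY();
                RAISERROR(@ErrMsg, @ErrSeverity, 1);
            END CATCH;
        END;

    On the .NET side a failure surfaces as a SqlException carrying the original message, as the post notes.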

    Read the article

  • My experience working with Teradata SQL Assistant

    - by Kevin Shyr
    Originally posted on: http://geekswithblogs.net/LifeLongTechie/archive/2014/05/28/my-experience-working-with-teradata-sql-assistant.aspx

      - To this date, I still haven't figured out how to "toggle" between my query windows. It seems like unless I click on that "new" button on top, whatever SQL I generate from right-click just overrides the current SQL in the window. I'm probably missing a "generate new SQL in new window" setting.
      - By default, Teradata SQL Assistant doesn't execute just the SQL query I highlighted. There is a setting I have to change first.
      - I'm not really happy that SQL Assistant and SQL Administrator are different apps. Still trying to get used to the fact that I can't quickly look up a table's keys/relationships while writing a query; I have to switch between windows.
      - LOVE the execution plan / explanation. I think that part is better done than MS SQL in some ways.
      - The error messages could be better.
      - I feel that the Teradata .NET provider sends a smaller query command over than others, though I don't have any hard data to support my claim.
      - One of my queries in SSRS was passing multi-valued parameters to another query, and got the error "Teradata 3577 row size or sort key size overflow". The search on this error says the solution is to cast the result column into a smaller data type, but I found that the problem was that the parameter passed into the WHERE clause could not be too large.
      - I wish Teradata SQL Assistant would remember the window size I just adjusted to. Every time I execute the query, the result set, query, and exec log auto re-adjust back to the default size. In SSMS, if I adjust the result set area to be smaller, it stays like that if I execute a query in the same window.
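    To illustrate the commonly suggested workaround for error 3577 mentioned above (casting a wide result column to a smaller type), here is a minimal sketch; the database, table and column names are hypothetical:

        -- Hypothetical example of the "cast the result column" advice for error 3577
        SELECT
            order_id,
            CAST(long_comment AS VARCHAR(1000)) AS long_comment  -- shrink a wide column so the row fits the sort/spool limits
        FROM sandbox_db.orders
        ORDER BY order_id;

    As the post points out, though, in that particular case the real culprit was the size of the multi-valued parameter passed into the WHERE clause rather than the result columns.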

    Read the article

  • Watch 53rd Grammy Awards 2011 Live Stream & Follow The Event On Facebook & Twitter

    - by Gopinath
    The Grammy Awards is the biggest music awards ceremony honouring the music genres. The 53rd Grammy Awards function will be held on February 13, 2011 at Staples Center in Los Angeles. The star-studded musical event will be telecast live on CBS TV between 8pm ET and 11:30pm ET.

    Behind The Scenes Live Stream On YouTube and Grammy.com - Grammy, YouTube and TurboTax have teamed up to provide a live stream of behind-the-scenes coverage of key events starting at 5pm Feb 11th and running through February 13th. Note that this stream will not broadcast the 3-hour-long award presentation - kind of bad, but apart from the award presentation the rest of the ceremony is streamed here. The video stream is embedded below. Catch live behind-the-scenes coverage of key events during GRAMMY week. Before the show, you can watch the Nominees Reception, Clive Davis Pre-GRAMMY Gala, the red carpet and the pre-telecast ceremony. During the show, go backstage to see the winners after they accept their award! You can catch the same stream on the Grammy's YouTube channel as well as on the Grammy.com website.

    Follow Grammy Awards On Facebook & Twitter - The Grammy Awards has official Facebook and Twitter accounts and you can follow them for all the latest information:
      - The Grammy's Facebook Page
      - The Grammy's Twitter Page

    Grammy Live Streaming on UStream & Justin.tv - Even though there is no official source streaming the Grammy Awards presentation ceremony live, there will be plenty of unofficial sources on UStream and Justin.tv. So search UStream and Justin.tv for live streams of the Grammys.

    TV Channels Telecasting The Grammy Awards 2011 - Here is the list of TV channels in various countries offering a live telecast of The Grammy Awards 2011. We keep updating this section as and when we get more information.
      - USA – CBS Television Network – February 13, 2011
      - India – VH1 TV – February 14, 2011, 6:30 AM
      - United Kingdom – ITV2 – 16 February 2011, 10:00PM – 12:00AM

    This article titled, Watch 53rd Grammy Awards 2011 Live Stream & Follow The Event On Facebook & Twitter, was originally published at Tech Dreams. Grab our RSS feed or fan us on Facebook to get updates from us.

    Read the article

  • Slow (to none) performance on SQL 2005 after attaching SQL 2000 database

    - by ploft
    Issue: Used detach/attach to move a SQL database from a SQL 2000 SP4 instance to a much beefier SQL 2005 SP2 server. Have run reindex, reorganize and update statistics a couple of times, but without any success. Queries on SQL 2000 took about 1-2 sec. to complete; now the same queries take 2-3 min on SQL 2005 (and even 2008 - tested it there also). Have looked at the execution plans and the overall percentages match or are alike on each server.
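    For reference, a minimal sketch of the kind of post-attach maintenance the question describes; the database and table names here are hypothetical and not taken from the original post:

        -- Hypothetical post-attach maintenance after moving a SQL 2000 database to SQL 2005
        USE LegacyDB;  -- made-up database name

        -- Correct page/row counts carried over from SQL 2000
        DBCC UPDATEUSAGE (LegacyDB);

        -- Rebuild indexes on a representative table (repeat per table)
        ALTER INDEX ALL ON dbo.Orders REBUILD;

        -- Refresh statistics on that table with a full scan
        UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

        -- Or update statistics across the whole database
        EXEC sp_updatestats;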

    Read the article

  • Microsoft guarantees the performance of SQL Server

    - by simonsabin
    I have recently been informed that Microsoft will be guaranteeing the performance of SQL Server. Yes, that's right: Microsoft will guarantee that you will get better performance out of SQL Server than out of any competitor's system. However, on the flip side, they are also saying that end users have to guarantee the performance of SQL Server if they want to use the next release of SQL Server, targeted for 2011 or 2012. It appears that a recent recruit, Mark Smith from Newcastle, England, will be heading a new team that will be making sure you are running SQL Server on adequate hardware and making sure you are developing your applications according to best practices. The Performance Enforcement Team (SQLPET) will be a global group headed by Mark that will oversee two other groups: the existing Customer Advisory Team (SQLCAT) and another new team, the Design and Operation Group (SQLDOG). Mark informed me that the team was originally thought out during Yukon and was going to be an independent body that went round to customers making sure they didn't suffer performance problems. However, it was felt that they needed to wait a few releases until SQL Server was really there. The original Yukon Independent Performance Enhancement Team (YIPET) has now become the SQL Performance Enforcement Team (SQLPET). When challenged about the change from enhancement to enforcement, Mark was unwilling to comment. An anonymous source suggested that "..Microsoft is sick of the bad press SQL Server gets for performance when the performance problems are normally down to people developing applications badly and using inadequate hardware..." It's true that it is very easy to install and run SQL, unlike other RDBMS systems, and the flip side is that it's also easy to get into performance problems due to under-specified hardware and bad design. It's not yet confirmed if this enforcement will apply to all SKUs or just the high-end ones. I would personally welcome some level of architectural and hardware advice service that clients would be able to turn to, in order to justify getting the appropriate hardware at the start of a project and not 1 year in when it's often too late.

    Read the article

  • My “SQL Server” Goals for 2011

    - by NeilHambly
    Having read a few blogs by various SQL people setting out their "Goals" for the year ahead, and having some clearly defined SQL-based goals already in mind, I have decided to share these for others to see and ask me from time to time how I'm progressing with them. So, although in no particular priority order, here are my chosen goals for 2011.
    SQL conferences & Training Events:
      · SQLCruise (June 2011) Alaska - I'm booked on this one already!!
      · SQL Master Week # 1 (April/May 2011) Master...(read more)

    Read the article

  • Multiple vulnerabilities in Mozilla Firefox

    - by chandan
    CVE / Description / CVSSv2 Base Score (Component: Firefox web browser; Product and Resolution: Solaris 11 11/11 SRU 3, Solaris 10 Contact Support):
      - CVE-2011-2372 - Permissions, Privileges, and Access Controls vulnerability - 3.5
      - CVE-2011-2995 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-2997 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-3000 - Improper Control of Generation of Code ('Code Injection') vulnerability - 4.3
      - CVE-2011-3001 - Permissions, Privileges, and Access Controls vulnerability - 4.3
      - CVE-2011-3002 - Denial of Service (DoS) vulnerability - 9.3
      - CVE-2011-3003 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-3004 - Improper Input Validation vulnerability - 4.3
      - CVE-2011-3005 - Denial of Service (DoS) vulnerability - 9.3
      - CVE-2011-3232 - Improper Control of Generation of Code ('Code Injection') vulnerability - 9.3
      - CVE-2011-3648 - Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting') vulnerability - 4.3
      - CVE-2011-3650 - Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability - 9.3
      - CVE-2011-3651 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-3652 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-3654 - Denial of Service (DoS) vulnerability - 10.0
      - CVE-2011-3655 - Improper Control of Generation of Code ('Code Injection') vulnerability - 9.3

    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • My .NET Technology picks for 2011

    - by shiju
    My Technology predictions for 2011: Cloud computing and mobile application development will be the hottest trends for 2011. I hope that Windows Azure will be very hot in 2011 and a lot of cloud computing adoption will happen with Windows Azure. Web application scalability will be the big challenge for architects in the next year, and architecture approaches like CQRS will get some attention. Architects will look at different options for web application scalability, and adoption of NoSQL and document databases will grow in 2011. The following are my technology picks for the .NET stack.

    Windows Azure
    Windows Azure will be one of the hottest technologies of 2011. Adoption of the cloud and Windows Azure will get big attention next year. The Windows Azure platform is a flexible cloud-computing platform that lets you focus on solving business problems and addressing customer needs. No need to invest upfront on expensive infrastructure. Pay only for what you use, scale up when you need capacity and pull it back when you don't. We handle all the patches and maintenance - all in a secure environment with over 99.9% uptime.

    Silverlight 5
    Silverlight is becoming a common technology for a variety of development platforms. You can develop Silverlight applications for web, desktop and Windows Phone. The new Silverlight 5 beta will be available during the first quarter of next year with new capabilities and lots of new features. Silverlight 5 will be a powerful development platform for both web-based business apps and rich media solutions. We can expect the final version of Silverlight 5 at the end of 2011.

    Windows Phone 7 Development Tools
    Mobile application development will be very hot in 2011 and Windows Phone 7 will be one of the hottest technologies of next year. You can get an introduction to the Windows Phone 7 Development Tools from somasegar's blog post and the MSDN documentation available from here.

    EF Code First
    I am a big fan of Entity Framework's Code First approach and hope that the Code First approach will attract more people to Entity Framework 4. EF Code First lets you focus on the domain model, which will enable Domain-Driven Development for applications. I hope that DDD fans will love the EF Code First approach. Entity Framework 4 now supports three types of approaches and these will attract different types of developer audience.

    ASP.NET MVC 3
    ASP.NET MVC 3 will be the hottest technology of the Microsoft web stack in the next year. ASP.NET developers will widely move to the ASP.NET MVC Framework from WebForms development. The new Razor view engine is great and it will increase the adoption of ASP.NET MVC 3. Razor will improve productivity when working with ASP.NET MVC 3 views. You can build great web applications using ASP.NET MVC 3 and jQuery with better maintainability, generation of clean HTML and even better performance. In my opinion, the best technology stack for web development is ASP.NET MVC 3 with Entity Framework 4 Code First as the ORM. Next year, you can expect more articles from my blog on ASP.NET MVC 3 and Entity Framework 4 Code First.

    RavenDB
    NoSQL and document databases will get more attention in the coming year, and RavenDB will be the most notable document database in the .NET stack. RavenDB is an Open Source (with a commercial option) document database for the .NET/Windows platform developed by Ayende Rahien. RavenDB is a .NET-focused document database which comes with a fully functional .NET client API and supports LINQ. I have written a few articles on RavenDB and you can read them from here.

    Managed Extensibility Framework (MEF)
    Many people haven't realized the power of MEF. MEF lets you create extensible applications and provides a great solution for the runtime extensibility problem. I hope that .NET developers will adopt MEF more in the next year for their .NET applications. You can get an excellent introduction to MEF from Anoop Madhusudanan's blog post "MEF or Managed Extensibility Framework – Creating a Zoo and Animals".

    Read the article

  • Microsoft Business Intelligence Seminar 2011

    - by DavidWimbush
    I was lucky enough to attend the maiden presentation of this at Microsoft Reading yesterday. It was pretty gripping stuff not only because of what was said but also because of what could only be hinted at. Here's what I took away from the day. (Disclaimer: I'm not a BI guru, just a reasonably experienced BI developer, so I may have misunderstood or misinterpreted a few things. Particularly when so much of the talk was about the vision and subtle hints of what is coming. Please comment if you think I've got anything wrong. I'm also not going to even try to cover Master Data Services as I struggled to imagine how you would actually use it.) I was a bit worried when I learned that the whole day was going to be presented by one guy but Rafal Lukawiecki is a very engaging speaker. He's going to be presenting this about 20 times around the world over the coming months. If you get a chance to hear him speak, I say go for it. No doubt some of the hints will become clearer as Denali gets closer to RTM. Firstly, things are definitely happening in the SQL Server Reporting and BI world. Traditionally IT would build a data warehouse, then cubes on top of that, and then publish them in a structured and controlled way. But, just as with many IT projects in general, by the time it's finished the business has moved on and the system no longer meets their requirements. This not sustainable and something more agile is needed but there has to be some control. Apparently we're going to be hearing the catchphrase 'Balancing agility with control' a lot. More users want more access to more data. Can they define what they want? Of course not, but they'll recognise it when they see it. It's estimated that only 28% of potential BI users have meaningful access to the data they need, so there is a real pent-up demand. The answer looks like: give them some self-service tools so they can experiment and see what works, and then IT can help to support the results. It's estimated that 32% of Excel users are comfortable with its analysis tools such as pivot tables. It's the power user's preferred tool. Why fight it? That's why PowerPivot is an Excel add-in and that's why they released a Data Mining add-in for it as well. It does appear that the strategy is going to be to use Reporting Services (in SharePoint mode), PowerPivot, and possibly something new (smiles and hints but no details) to create reports and explore data. Everything will be published and managed in SharePoint which gives users the ability to mash-up, share and socialise what they've found out. SharePoint also gives IT tools to understand what people are looking at and where to concentrate effort. If PowerPivot report X becomes widely used, it's time to check that it shows what they think it does and perhaps get it a bit more under central control. There was more SharePoint detail that went slightly over my head regarding where Excel Services and Excel Web Application fit in, the differences between them, and the suggestion that it is likely they will one day become one (but not in the immediate future). That basic pattern is set to be expanded upon by further exploiting Vertipaq (the columnar indexing engine that enables PowerPivot to store and process a lot of data fast and in a small memory footprint) to provide scalability 'from the desktop to the data centre', and some yet to be detailed advances in 'frictionless deployment' (part of which is about making the difference between local and the cloud pretty much irrelevant). 
    Excel looks like becoming Microsoft's primary BI client. It already has:
      - the ability to consume cubes
      - strong visualisation tools
      - slicers (which are part of Excel not PowerPivot)
      - a data mining add-in
      - PowerPivot

    A major hurdle for self-service BI is presenting the data in a consumable format. You can't just give users PowerPivot and a server with a copy of the OLTP database(s). Building cubes is labour intensive and doesn't always give the user what they need. This is where the BI Semantic Model (BISM) comes in. I gather it's a layer of metadata you define that can combine multiple data sources (and types of data source) into a clear 'interface' that users can work with. It comes with a new query language called DAX. SSAS cubes are unlikely to go away overnight because, with their pre-calculated results, they are still the most efficient way to work with really big data sets.

    A few other random titbits that came up:
      - Reporting Services is going to get some good new stuff in Denali.
      - Keep an eye on www.projectbotticelli.com for the slides. You can also view last year's seminar sessions which covered a lot of the same ground as far as the overall strategy is concerned. They plan to add more material as Denali's features are publicly exposed.
      - Check out the PASS keynote address for a showing of Yahoo's SQL BI servers. Apparently they wheeled the rack out on stage still plugged in and running!
      - Check out the Excel 2010 Data Mining Add-Ins. 32 bit only at present but 64 bit is on the way.
      - There are lots of data sets, many of them free, at the Windows Azure Marketplace Data Market (where you can also get ESRI shape files).
      - If you haven't already seen it, have a look at the Silverlight Pivot Viewer (http://weblogs.asp.net/scottgu/archive/2010/06/29/silverlight-pivotviewer-now-available.aspx).
      - The Bing Maps Data Connector is worth a look if you're into spatial stuff (http://www.bing.com/community/site_blogs/b/maps/archive/2010/07/13/data-connector-sql-server-2008-spatial-amp-bing-maps.aspx).

    Read the article

  • SQL Server Connections Fall 2011 - Demos

    - by Adam Machanic
    Today is the last day of the annual SQL Server Connections show in Vegas, and I've just completed my third and final talk. (Now off to find a frosty beverage or two.) This year I did three sessions: SQL302: Parallelism and Performance: Are You Getting Full Return on Your CPU Investment? Over the past five years, multi-core processors have made the jump from semi-obscure to commonplace in the data center. While servers with 16, 32, or even 64 cores were once an out-of-reach choice for all except the...(read more)

    Read the article
