Search Results

Search found 4621 results on 185 pages for 'scott lock'.

  • Can't delete profile

    - by generic_noob
    Hi all, I have a client machine (XP SP3) that used to be joined to a Windows 2003 domain, but the server has since gone down due to a hard drive failure. The old profiles still remain in 'Documents and Settings'. I have a local admin account on the same machine (in the Administrators group), yet when I try to remove the profiles manually from System Properties - Advanced - User Profiles - Settings, the Delete button is disabled. Windows also prevents me from deleting or renaming the user's profile folder, due to a lock on ntuser.dat. Any thoughts on this would be greatly appreciated. Cheers!

    Read the article

  • Windows 8: 100% disk active time, no actual data transferred

    - by fingerbangpalateclick
    Occasionally, several times an hour, my hard drive appears to lock up: Task Manager shows 100% active time with read and write speeds of 0. I can still switch between open windows, but anything that requires disk access stalls for around a minute until the hard disk starts working properly again. It happens at apparently random intervals, and only in Windows 8, not in Windows 7 or Linux. It is probably not a problem with the disk itself: this is a relatively new hard drive, and S.M.A.R.T. shows no errors. It only happens in Windows 8, not in any other OS that has used the same partition or different partitions. So, what is going on? How can I fix this? Note: this is a different problem than this one: "Extremely high disk activity without any real usage". My Task Manager would look similar, but Average Response Time, Read Speed, and Write Speed would all be 0.

    Read the article

  • RDC stops working after a period of time.

    - by xjerx
    I have a workstation with RDC configured for the employee. When they leave at the end of their day they lock the PC (Windows key + L). They go home, connect to our VPN, and log back in. Everything works fine. The following morning they attempt to log in before they return to the office, and the computer does not respond to the RDC request. I've found that it becomes completely unresponsive to ICMP requests as well. Once the user reboots the computer everything works fine again. I'm going to turn off RDC, reboot, turn RDC back on and reboot again to see if that fixes the problem. Until then, does anyone have any other ideas?

    Read the article

  • Microsoft Enterprise Library Caching Application Block not thread safe?!

    - by AlanR
    Good afternoon, I created a super simple console app to test out the Enterprise Library Caching Application Block, and the behavior is baffling. I'm hoping I screwed up something in the setup that's easy to fix. Each item expires after 5 seconds for testing purposes. Basic setup: every second, pick a number between 0 and 2; if the cache doesn't already have it, put it in there, otherwise just grab it from the cache. Do this inside a lock statement to ensure thread safety.

    APP.CONFIG:

      <configuration>
        <configSections>
          <section name="cachingConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Caching.Configuration.CacheManagerSettings, Microsoft.Practices.EnterpriseLibrary.Caching, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        </configSections>
        <cachingConfiguration defaultCacheManager="Cache Manager">
          <cacheManagers>
            <add expirationPollFrequencyInSeconds="1" maximumElementsInCacheBeforeScavenging="1000" numberToRemoveWhenScavenging="10" backingStoreName="Null Storage" type="Microsoft.Practices.EnterpriseLibrary.Caching.CacheManager, Microsoft.Practices.EnterpriseLibrary.Caching, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="Cache Manager" />
          </cacheManagers>
          <backingStores>
            <add encryptionProviderName="" type="Microsoft.Practices.EnterpriseLibrary.Caching.BackingStoreImplementations.NullBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="Null Storage" />
          </backingStores>
        </cachingConfiguration>
      </configuration>

    C#:

      using System;
      using System.Collections.Generic;
      using System.Linq;
      using System.Text;
      using Microsoft.Practices.EnterpriseLibrary.Common;
      using Microsoft.Practices.EnterpriseLibrary.Caching;
      using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;

      namespace ConsoleApplication1
      {
          class Program
          {
              public static ICacheManager cache = CacheFactory.GetCacheManager("Cache Manager");

              static void Main(string[] args)
              {
                  while (true)
                  {
                      System.Threading.Thread.Sleep(1000); // sleep for one second.
                      var key = new Random().Next(3).ToString();
                      string value;
                      lock (cache)
                      {
                          if (!cache.Contains(key))
                          {
                              cache.Add(key, key, CacheItemPriority.Normal, null, new SlidingTime(TimeSpan.FromSeconds(5)));
                          }
                          value = (string)cache.GetData(key);
                      }
                      Console.WriteLine("{0} --> '{1}'", key, value);
                      //if (null == value) throw new Exception();
                  }
              }
          }
      }

    OUTPUT -- how can I prevent the cache from returning nulls?

      2 --> '2'
      1 --> '1'
      2 --> '2'
      0 --> '0'
      2 --> '2'
      0 --> '0'
      1 --> ''
      0 --> '0'
      1 --> '1'
      2 --> ''
      0 --> '0'
      2 --> '2'
      0 --> '0'
      1 --> ''
      2 --> '2'
      1 --> '1'
      Press any key to continue . . .

    Thanks in advance, -Alan.
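
    One pattern sometimes suggested for this situation (a sketch only, not necessarily the fix for this exact behavior): the Caching Application Block expires items on its own background thread, so an item can vanish between the Contains() check and the GetData() call even inside the lock. Reading first and re-adding on a null avoids trusting a stale Contains() result. The helper name below is ours, and it reuses only the ICacheManager members already shown above:

      // Minimal sketch: treat a null from GetData() as "missing" and rebuild,
      // instead of relying on a Contains() check that can go stale when the
      // block's background expiration thread removes the item in between.
      static string GetOrAdd(ICacheManager cache, string key)
      {
          lock (cache)
          {
              var value = (string)cache.GetData(key);   // single read
              if (value == null)
              {
                  value = key;                          // rebuild the value here
                  cache.Add(key, value, CacheItemPriority.Normal, null,
                            new SlidingTime(TimeSpan.FromSeconds(5)));
              }
              return value;
          }
      }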

    Read the article

  • Is LINQ to SQL deprecated?

    - by Mayo
    Back in late 2008 there was a lot of debate about the future of LINQ to SQL. Many suggested that Microsoft's investments in the Entity Framework in .NET 4.0 were a sign that LINQ to SQL had no future. I figured I'd wait before making my own decision, since folks were not in agreement. Fast-forward 18 months: I've got vendors providing solutions that rely on LINQ to SQL, and I have personally given it a try and really enjoyed working with it. I figured it was here to stay. But I'm reading a new book (C# 4.0 How-To by Ben Watson), and in chapter 21 (LINQ) he suggests that it "has been more or less deprecated by Microsoft" and recommends using LINQ to Entity Framework instead. My question to you is whether or not LINQ to SQL is officially deprecated and/or whether authoritative sources (Microsoft, Scott Gu, etc.) officially suggest using LINQ to Entities instead of LINQ to SQL.

    Read the article

  • Custom app_offline.htm file during publish

    - by Charlino
    When I publish my ASP.NET MVC application, it generates an app_offline.htm file to take the site offline while it updates the website, and then deletes the file once the publish is successful. This is cool and I really like the idea, but I want to create my own custom app_offline.htm file that the publish action is aware of, and put it somewhere where it doesn't affect my development site - i.e. it doesn't sit in the root of my development site rendering it offline all the time. TIA, Charles. EDIT: From the comments on Scott Gu's post about app_offline.htm, it seems that customization of the app_offline.htm file wasn't possible with VS 2005 - has this changed with VS 2008 and now VS 2010?

    Read the article

  • Config transformations and “TransformXml task failed” error message

    - by Troy Hunt
    I’ve just enabled config transformations on a .NET 3.5 project in VS2010 RC after watching Scott Hanselman’s video on web deployment. Unfortunately every time I go to publish I now get the following error:

      The "TransformXml" task failed unexpectedly.
      System.UriFormatException: Invalid URI: The URI is empty.
         at System.Uri.CreateThis(String uri, Boolean dontEscape, UriKind uriKind)
         at System.Uri..ctor(String uriString)
         at Microsoft.Web.Publishing.Tasks.TransformXml.Execute()
         at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
         at Microsoft.Build.BackEnd.TaskBuilder.ExecuteInstantiatedTask(ITaskExecutionHost taskExecutionHost, TaskLoggingContext taskLoggingContext, TaskHost taskHost, ItemBucket bucket, TaskExecutionMode howToExecuteTask, Boolean& taskResult)

    If I take a brand new VS2010 web application which already has the config transformations by default I don’t have a problem, so I suspect my issue is project related. Has anyone come across this before or have any ideas on a fix?

    Read the article

  • Best Non-Microsoft/.NET Development Blogs and Podcasts

    - by Mark Lubin
    I really don't want to ruffle anyone's feathers, and I really enjoy blogs like Joel, Coding Horror, and Scott Hanselman. However, what are some good blogs/podcasts that focus on platforms other than Microsoft-based ones, or maybe no particular platform at all? In particular I am interested in areas like Python, Ruby, Java, and C, but any area besides Microsoft fits the bill. There may be other questions similar to this, but I think this one is unique since we might get some answers we wouldn't see otherwise: that is, everyone knows Coding Horror is great. Please try to refrain from religious debates; the content of the blog is more important than the platform it covers. I just want to get an idea of what is out there and worth the time.

    Read the article

  • Urlrewriting.net pages are not causing postbacks

    - by Nick
    I'm using WebForms with UrlRewriting.Net to rewrite pages, e.g. http://www.example.com/stuff.aspx?c=30 becomes http://www.example.com/stuff/30-this-stuff.aspx. It works insofar as the correct content is loading; however, none of the postbacks are working (mostly buttons on the page). If I set a breakpoint in Page_Load, I see that IsPostBack is always false. Any ideas on how to fix this? Right now I'm just on Visual Studio 2008. EDIT: I have since switched to UrlRewriter.Net, which worked after a few tweaks (see Scott Gu's article). Besides here, I have posted my original problem to the developer's forum; if I ever get an answer, I'll post it here (unless someone else posts it first).
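
    For context, the usual suspect with URL rewriting and WebForms is that the rendered form action points at the physical .aspx URL rather than the rewritten one, so the browser posts back to a URL the page doesn't recognize. A minimal sketch of the commonly suggested workaround, assuming .NET 3.5 SP1 or later (where HtmlForm exposes a settable Action property); the member names are generic:

      // In the page (or a base page class): point the form's postback URL
      // back at the rewritten, browser-visible URL instead of the real .aspx.
      protected void Page_Load(object sender, EventArgs e)
      {
          // Request.RawUrl is the URL the browser originally requested,
          // before the rewriting module changed it.
          Page.Form.Action = Request.RawUrl;
      }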

    Read the article

  • Using GIT Smart HTTP via IIS

    - by Andrew Matthews
    I recently read Scott Chacon's post "Smart HTTP Transport", and I was hoping that it might have become possible via IIS (Windows 7) since that post was written. I haven't been able to find anything showing how it can be done, and Apache is not an option in my IIS 7 based environment, so I'm at a loss (git daemon was foiled for me by a combination of AVG anti-virus and AD). I want to provide LDAP-authenticated read/write access for selected users, so this question seems not to be relevant. Do you know of a way to provide access to Git via IIS?

    Read the article

  • SQL Server Gender Swap Query

    - by Sanju
    I have a table with the following columns: ID, Name, Designation, Salary, Contact, Address, Gender. Accidentally, for all male employees I entered 'Female' in the Gender column, and similarly for all female employees I entered 'Male'. For example:

      0023 Scott Developer 15000 4569865 Cliff_Eddington,USA Female

    In the above row there should be Male instead of Female; every male's gender has been recorded as Female, and likewise every female's as Male. Is there any single query through which I can change all the rows whose Gender is Male to Female, and all the rows whose Gender is Female to Male?

    Read the article

  • Best Practices - Data Annotations vs OnChanging in Entity Framework 4

    - by jptacek
    I was wondering what the general recommendation is for Entity Framework in terms of data validation. I am relatively new to EF, but it appears there are two main approaches to data validation. The first is to create a partial class for the model and then perform data validations and update a rule-violation collection of some sort. This is outlined at http://msdn.microsoft.com/en-us/library/cc716747.aspx. The other is to use data annotations and have the annotations perform the data validation. Scott Guthrie explains this on his blog at http://weblogs.asp.net/scottgu/archive/2010/01/15/asp-net-mvc-2-model-validation.aspx. I was wondering what the benefits are of one over the other. It seems data annotations would be the preferred mechanism, especially as you move to RIA Services, but I want to ensure I am not missing something. Of course, nothing precludes using both of them together. Thanks, John
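
    To make the annotations approach concrete, here is a minimal sketch of the usual "buddy class" pattern for attaching data annotations to an EF-generated entity. The entity and property names are hypothetical; only the attributes come from System.ComponentModel.DataAnnotations:

      using System.ComponentModel.DataAnnotations;

      // The generated EF partial class picks up validation metadata from a
      // separate "buddy" type via MetadataType.
      [MetadataType(typeof(CustomerMetadata))]
      public partial class Customer { }

      public class CustomerMetadata
      {
          [Required(ErrorMessage = "Name is required")]
          [StringLength(50)]
          public string Name { get; set; }

          [Range(0, 150)]
          public int Age { get; set; }
      }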

    Read the article

  • What initial modelling/design activities do you do on Agile projects?

    - by dalton
    When developing an application using agile techniques, what, if any, initial modelling/architecture activities do you do, and how do you capture that knowledge? I'm not after a bullet list about XP, Scrum, Crystal, DSDM, etc., as I'm familiar with the methodologies; rather, what do you do above and beyond the guidance given by these? I find I work best by thinking the system through first, but I also like the benefits of timeboxing, story cards, pairing, and TDD. The closest thing I've seen so far is Scott Ambler's Initial Architecture Modelling, but I was wondering what alternatives are used out there.

    Read the article

  • ASP.NET MVC Validation - localisation of the error string

    - by gmang
    I followed the technique from Scott Gu's post ASP.NET MVC 2: Model Validation (http://weblogs.asp.net/scottgu/archive/2010/01/15/asp-net-mvc-2-model-validation.aspx). However, I am building a localised web site. How can I localize the error string? I tried replacing the following:

      [RegularExpression(@"\d{4}", ErrorMessage = "Must be a 4 digit year")]
      public Nullable YearOfWork { get; set; }

    with the following:

      [RegularExpression(@"\d{4}", ErrorMessage = Resources.SharedStrings.search_error1)]
      public Nullable YearOfWork { get; set; }

    but I get a compilation error: "An attribute argument must be a constant expression, typeof expression or array creation expression of an attribute parameter type". Please help!
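
    For reference, the validation attributes expose ErrorMessageResourceType and ErrorMessageResourceName precisely because attribute arguments must be compile-time constants; the message is then looked up from the resource class at runtime. A minimal sketch reusing the resource names from the question (the containing class name and the int? property type are assumed for illustration, and the generated resource class and its members must be public):

      using System.ComponentModel.DataAnnotations;

      public class SearchModel   // hypothetical model class
      {
          // Point the attribute at the resource type and key instead of a
          // constant string literal.
          [RegularExpression(@"\d{4}",
              ErrorMessageResourceType = typeof(Resources.SharedStrings),
              ErrorMessageResourceName = "search_error1")]
          public int? YearOfWork { get; set; }
      }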

    Read the article

  • SqlBulkCopy is slow, doesn't utilize full network speed

    - by Alex
    Hi, for the past couple of weeks I have been creating a generic script that is able to copy databases. The goal is to be able to specify any database on some server and copy it to some other location, and it should only copy the specified content. The exact content to be copied over is specified in a configuration file. This script is going to be used on some 10 different databases and run weekly. In the end we are copying only about 3%-20% of databases which are as large as 500 GB. I have been using the SMO assemblies to achieve this. This is my first time working with SMO and it took a while to create a generic way to copy the schema objects, filegroups, etc. (it actually helped find some bad stored procs). Overall I have a working script which is lacking in performance (and at times times out), and I was hoping you guys would be able to help. When executing the WriteToServer command to copy a large amount of data (6 GB) it reaches my timeout period of 1 hour. Here is the core code for copying table data. The script is written in PowerShell.

      $query = ("SELECT * FROM $selectedTable " + $global:selectiveTables.Get_Item($selectedTable)).Trim()
      Write-LogOutput "Copying $selectedTable : '$query'"
      $cmd = New-Object Data.SqlClient.SqlCommand -argumentList $query, $source
      $cmd.CommandTimeout = 120;
      $bulkData = ([Data.SqlClient.SqlBulkCopy]$destination)
      $bulkData.DestinationTableName = $selectedTable;
      $bulkData.BulkCopyTimeout = $global:tableCopyDataTimeout  # = 3600
      $reader = $cmd.ExecuteReader();
      $bulkData.WriteToServer($reader);  # Takes forever here on large tables

    The source and target databases are located on different servers, so I kept track of the network speed as well. The network utilization never went over 1%, which was quite surprising to me. But when I just transfer some large files between the servers, the network utilization spikes up to 10%. I have tried setting the $bulkData.BatchSize to 5000 but nothing really changed. Increasing the BulkCopyTimeout to an even greater amount would only work around the timeout. I really would like to know why the network is not being used fully. Anyone else had this problem? Any suggestions on networking or bulk copy will be appreciated. And please let me know if you need more information. Thanks.

    UPDATE: I have tweaked several options that increase the performance of SqlBulkCopy, such as setting the transaction logging to simple and providing a table lock to SqlBulkCopy instead of the default row lock. Also, some tables are better optimized for certain batch sizes. Overall, the duration of the copy was decreased by some 15%. What we will also do is execute the copy of each database simultaneously on different servers. But I am still having a timeout issue when copying one of the databases. When copying one of the larger databases, there is a table for which I consistently get the following exception:

      System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

    It is thrown about 16 after it starts copying the table, which is nowhere near my BulkCopyTimeout. Even though I get the exception, that table is fully copied in the end. Also, if I truncate that table and restart my process for that table only, the table is copied over without any issues. But going through the process of copying that entire database always fails for that one table. I have tried executing the entire process and resetting the connection before copying that faulty table, but it still errored out. My SqlBulkCopy and Reader are closed after each table. Any suggestions as to what else could be causing the script to fail at that point each time?

    Read the article

  • Where can I find the Visual Studio 2010 RTM release notes?

    - by Lernkurve
    Question: Where can I find a list of the changes introduced in Visual Studio 2010 Ultimate RTM since Visual Studio 2010 Ultimate RC? In fact, I'm only interested in changes related to MS Test Manager 2010 and Coded UI Tests. Where I have looked so far: I have searched the Internet, looked for a readme.txt in the installation folder, looked into the Visual Studio help (F1), and browsed the "What's new in Visual Studio 2010" section on MSDN. No luck. I found Scott Guthrie's blog post Visual Studio 2010 and .NET 4 Released, but that's not exactly what I am looking for; it's not a changelog since VS2010 RC. I suppose there is no such file because they made too many changes to document and hand out to end users. But if there is one, I'd be glad if someone could point me to it. Thanks.

    Read the article

  • IE attachEvent on object tag causes memory corruption

    - by larswa
    I've an ActiveX control within an embedded IE8 HTML page that has the following event: MessageReceived([in] BSTR srcWindowId, [in] BSTR json). On Windows the event is registered with OCX.attachEvent("MessageReceived", onMessageReceivedFunc). The following code fires the event into the HTML page:

      HRESULT Fire_MessageReceived(BSTR id, BSTR json)
      {
          CComVariant varResult;
          T* pT = static_cast<T*>(this);
          int nConnectionIndex;
          CComVariant* pvars = new CComVariant[2];
          int nConnections = m_vec.GetSize();

          for (nConnectionIndex = 0; nConnectionIndex < nConnections; nConnectionIndex++)
          {
              pT->Lock();
              CComPtr<IUnknown> sp = m_vec.GetAt(nConnectionIndex);
              pT->Unlock();
              IDispatch* pDispatch = reinterpret_cast<IDispatch*>(sp.p);
              if (pDispatch != NULL)
              {
                  VariantClear(&varResult);
                  pvars[1] = id;
                  pvars[0] = json;
                  DISPPARAMS disp = { pvars, NULL, 2, 0 };
                  pDispatch->Invoke(0x1, IID_NULL, LOCALE_USER_DEFAULT, DISPATCH_METHOD, &disp, &varResult, NULL, NULL);
              }
          }
          delete[] pvars;  // -> Memory corruption here!
          return varResult.scode;
      }

    After I enabled gflags.exe with Application Verifier, the following strange behaviour occurs: after Invoke() executes the JavaScript callback, the BSTR from pvars[1] is copied to pvars[0] for some unknown reason!? The delete[] of pvars then causes a double free of the same string, which ends in a heap corruption. Does anybody have an idea what's going on here? Is this an IE bug, or is there a trick within the OCX implementation that I'm missing? If I use the tag like:

      <script for="OCX" event="MessageReceived(id, json)" language="JavaScript" type="text/javascript">
          window.onMessageReceivedFunc(windowId, json);
      </script>

    ... the strange copy operation does not occur. The following code also seems to be OK, due to the fact that the caller of Fire_MessageReceived() is responsible for freeing the BSTRs:

      HRESULT Fire_MessageReceived(BSTR srcWindowId, BSTR json)
      {
          CComVariant varResult;
          T* pT = static_cast<T*>(this);
          int nConnectionIndex;
          VARIANT pvars[2];
          int nConnections = m_vec.GetSize();

          for (nConnectionIndex = 0; nConnectionIndex < nConnections; nConnectionIndex++)
          {
              pT->Lock();
              CComPtr<IUnknown> sp = m_vec.GetAt(nConnectionIndex);
              pT->Unlock();
              IDispatch* pDispatch = reinterpret_cast<IDispatch*>(sp.p);
              if (pDispatch != NULL)
              {
                  VariantClear(&varResult);
                  pvars[1].vt = VT_BSTR;
                  pvars[1].bstrVal = srcWindowId;
                  pvars[0].vt = VT_BSTR;
                  pvars[0].bstrVal = json;
                  DISPPARAMS disp = { pvars, NULL, 2, 0 };
                  pDispatch->Invoke(0x1, IID_NULL, LOCALE_USER_DEFAULT, DISPATCH_METHOD, &disp, &varResult, NULL, NULL);
              }
          }
          delete[] pvars;
          return varResult.scode;
      }

    Thanks!

    Read the article

  • WCF Self-hosted service, client clean-up on service stop

    - by Sentax
    Hi everyone, I'm curious to know how I would go about setting up my service to stop cleanly on the server it will be installed on. For example, when I have many clients connecting and doing operations every minute and I want to shut down the service for maintenance, how can I, in the "OnStop" event of the Windows service, have the main service host deny any new client connections and let the current connections finish before it actually shuts down? This would ensure data isn't corrupted on the server as the service shuts down. Right now I'm not set up as a singleton because I need scalability in the service, so I would have to somehow get my service host to do this independently of knowing how many instances of the service class are created. I hope I've explained myself well enough; if not, let me know and I'll try to explain better. Thanks, Scott
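
    For what it's worth, ServiceHost.Close is documented to perform exactly this kind of graceful shutdown: it stops accepting new channels and waits (up to the close timeout) for in-flight operations to finish before tearing the host down, regardless of how many per-call or per-session instances exist. A minimal OnStop sketch under that assumption; the class and field names are placeholders:

      using System;
      using System.ServiceModel;
      using System.ServiceProcess;

      public class MyWcfWindowsService : ServiceBase
      {
          private ServiceHost _host;   // created and opened in OnStart

          protected override void OnStop()
          {
              // Ask the SCM for extra time so it doesn't kill us mid-drain.
              RequestAdditionalTime(30000);

              try
              {
                  // Close() stops new connections and lets current calls
                  // complete within the given timeout.
                  _host.Close(TimeSpan.FromSeconds(30));
              }
              catch (TimeoutException)
              {
                  _host.Abort();   // drain took too long; tear down hard
              }
              catch (CommunicationException)
              {
                  _host.Abort();
              }
          }
      }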

    Read the article

  • Authentication on odata service

    - by Toad
    I want to add some authentication to my OData service. Depending on the calling user I want to filter rows and/or remove columns. I read in Scott Hanselman's fine blog post on OData ( http://www.hanselman.com/blog/CreatingAnODataAPIForStackOverflowIncludingXMLAndJSONIn30Minutes.aspx ) that it is possible to intercept the incoming queries. If this works, I could add some extra filtering. How exactly would this intercepting and altering of queries work? I cannot find any examples of where and how to do this. (I'm using Entity Framework and WCF Data Services, just like Scott's example blog.)
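
    To illustrate the interception mechanism being referred to: WCF Data Services lets you add a method marked [QueryInterceptor("EntitySetName")] to the DataService class; it returns a filter expression that gets composed into every query against that entity set, which covers the per-user row filtering. A rough sketch with hypothetical entity and property names (hiding individual columns per user is not something a query interceptor does):

      using System;
      using System.Data.Services;
      using System.Linq.Expressions;
      using System.Web;

      public class MyDataService : DataService<MyEntities>   // MyEntities: the EF context
      {
          public static void InitializeService(DataServiceConfiguration config)
          {
              // Expose the hypothetical "Orders" set read-only.
              config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
          }

          // Runs for every query against "Orders"; the returned predicate is
          // ANDed into the query, so each caller only sees matching rows.
          [QueryInterceptor("Orders")]
          public Expression<Func<Order, bool>> FilterOrders()
          {
              string user = HttpContext.Current.User.Identity.Name;   // assumes ASP.NET hosting
              return o => o.OwnerName == user;
          }
      }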

    Read the article

  • unsigned char* buffer (FreeType2 Bitmap) to System::Drawing::Bitmap.

    - by Dennis Roche
    Hi, I'm trying to convert a FreeType2 bitmap to a System::Drawing::Bitmap in C++/CLI. FT_Bitmap has an unsigned char* buffer that contains the data to write. I have it somewhat working when saving to disk as a *.tga, but when saving as a *.bmp it renders incorrectly. I believe that the size of the byte[] is incorrect and that my data is truncated. Any hints/tips/ideas on what is going on here would be greatly appreciated. Links to articles explaining byte layout and pixel formats etc. would be helpful. Thanks!!

    C++/CLI code:

      FT_Bitmap *bitmap = &face->glyph->bitmap;

      int width = (face->bitmap->metrics.width / 64);
      int height = (face->bitmap->metrics.height / 64);

      // must be aligned on a 32 bit boundary or 4 bytes
      int depth = 8;
      int stride = ((width * depth + 31) & ~31) >> 3;
      int bytes = (int)(stride * height);

      // as *.tga
      void *buffer = bytes ? malloc(bytes) : NULL;
      if (buffer)
      {
          memset(buffer, 0, bytes);
          for (int i = 0; i < glyph->rows; ++i)
              memcpy((char *)buffer + (i * width), glyph->buffer + (i * glyph->pitch), glyph->pitch);
          WriteTGA("Test.tga", buffer, width, height);
      }

      // as *.bmp
      array<Byte>^ values = gcnew array<Byte>(bytes);
      Marshal::Copy((IntPtr)glyph->buffer, values, 0, bytes);

      Bitmap^ systemBitmap = gcnew Bitmap(width, height, PixelFormat::Format24bppRgb);

      // create bitmap data, lock pixels to be written.
      BitmapData^ bitmapData = systemBitmap->LockBits(Rectangle(0, 0, width, height), ImageLockMode::WriteOnly, bitmap->PixelFormat);
      Marshal::Copy(values, 0, bitmapData->Scan0, bytes);
      systemBitmap->UnlockBits(bitmapData);

      systemBitmap->Save("Test.bmp");

    Reference, FT_Bitmap:

      typedef struct FT_Bitmap_
      {
          int             rows;
          int             width;
          int             pitch;
          unsigned char*  buffer;
          short           num_grays;
          char            pixel_mode;
          char            palette_mode;
          void*           palette;
      } FT_Bitmap;

    Reference, WriteTGA:

      bool WriteTGA(const char *filename, void *pxl, uint16 width, uint16 height)
      {
          FILE *fp = NULL;
          fopen_s(&fp, filename, "wb");
          if (fp)
          {
              TGAHeader header;
              memset(&header, 0, sizeof(TGAHeader));
              header.imageType = 3;
              header.width = width;
              header.height = height;
              header.depth = 8;
              header.descriptor = 0x20;

              fwrite(&header, sizeof(header), 1, fp);
              fwrite(pxl, sizeof(uint8) * width * height, 1, fp);
              fclose(fp);
              return true;
          }
          return false;
      }

    Update:

      FT_Bitmap *bitmap = &face->glyph->bitmap;

      // stride must be aligned on a 32 bit boundary or 4 bytes
      int depth = 8;
      int stride = ((width * depth + 31) & ~31) >> 3;
      int bytes = (int)(stride * height);

      target = gcnew Bitmap(width, height, PixelFormat::Format8bppIndexed);

      // create bitmap data, lock pixels to be written.
      BitmapData^ bitmapData = target->LockBits(Rectangle(0, 0, width, height), ImageLockMode::WriteOnly, target->PixelFormat);

      array<Byte>^ values = gcnew array<Byte>(bytes);
      Marshal::Copy((IntPtr)bitmap->buffer, values, 0, bytes);
      Marshal::Copy(values, 0, bitmapData->Scan0, bytes);

      target->UnlockBits(bitmapData);

    Read the article

  • Simple Linq Dynamic Query question

    - by Dr. Zim
    In Linq Dynamic Query, Scott Guthrie shows an example LINQ query:

      var query = db.Customers.
          Where("City == @0 and Orders.Count >= @1", "London", 10).
          OrderBy("CompanyName").
          Select("new( CompanyName as Name, Phone)");

    Notice the projection new( CompanyName as Name, Phone). If I have a class like this:

      public class CompanyContact
      {
          public string Name { get; set; }
          public string Phone { get; set; }
      }

    how could I essentially "cast" this result to the CompanyContact data type without doing a foreach over each record and dumping it into a different data structure? To my knowledge the only .Select available is the Dynamic Query version, which only takes a string and a parameter list.
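
    As a point of comparison (a sketch only, separate from the Dynamic Query string-based API): when the projection shape is known at compile time, the ordinary strongly typed Select can materialize CompanyContact directly, so no foreach-and-copy step is needed. The Customer property names below are assumed from the sample query:

      var contacts = db.Customers
          .Where(c => c.City == "London" && c.Orders.Count >= 10)
          .OrderBy(c => c.CompanyName)
          .Select(c => new CompanyContact
          {
              Name = c.CompanyName,
              Phone = c.Phone
          })
          .ToList();   // List<CompanyContact>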

    Read the article

  • Using jQuery for client side validation in MVC2 RTM

    - by tigermain
    Scott Gu's tutorial on model validation gets us all set up with the MS client-side validation using the following scripts:

      <script src="../../Scripts/jquery.validate.min.js" type="text/javascript"></script>
      <script src="../../Scripts/MicrosoftMvcValidation.js" type="text/javascript"></script>

    However, I've seen various posts allowing us to utilise jQuery instead with the following code:

      <script src="https://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js" type="text/javascript"></script>
      <script src="https://ajax.microsoft.com/ajax/jQuery.Validate/1.6/jQuery.Validate.min.js" type="text/javascript"></script>
      <script src="<%= Url.Content("~/scripts/MicrosoftMvcJQueryValidation.js") %>" type="text/javascript"></script>

    However, MicrosoftMvcJQueryValidation.js does not ship with the solution, and from what I read it should be part of the Futures pack, which is no longer available on CodePlex. I managed to find a version alongside jQuery 1.3.2 but it does not work. What is the going-forward solution?

    Read the article

  • No_data_found exception is propagating to outer block also?

    - by Vineet
    In my code I enter a salary which is not available in the employees table, and then in the exception block (where I am handling NO_DATA_FOUND) I insert a duplicate employee_id into the primary key column of the employees table. But I do not understand why the "no data found" exception shows up at the end as well.

    OUTPUT coming:

      Enter some other sal
      ORA-01400: cannot insert NULL into ("SCOTT"."EMPLOYEES"."LAST_NAME")
      ORA-01403: no data found    -- This should not come according to logic

    This is the code:

      DECLARE
          v_sal number := &p_sal;
          v_num number;
      BEGIN
          BEGIN
              select salary INTO v_num from employees where salary = v_sal;
          EXCEPTION
              WHEN no_data_found THEN
                  DBMS_OUTPUT.PUT_LINE('Enter some other sal');
                  INSERT INTO employees (employee_id) values (100);
          END;
      EXCEPTION
          WHEN OTHERS THEN
              DBMS_OUTPUT.PUT_LINE(sqlerrm);
      END;

    Read the article

  • Trying to drop all tables from my schema with no rows?

    - by Vineet
    I am trying to drop all the tables in my schema that have no rows, but when I execute this code I get an error. This is the code:

      create or replace procedure tester IS
          v_count NUMBER;
          CURSOR emp_cur IS
              select table_name from user_tables;
      BEGIN
          FOR emp_rec_cur IN emp_cur LOOP
              EXECUTE IMMEDIATE 'select count(*) from ' || emp_rec_cur.table_name INTO v_count;
              IF v_count = 0 THEN
                  EXECUTE IMMEDIATE 'DROP TABLE ' || emp_rec_cur.table_name;
              END IF;
          END LOOP;
      END tester;

    And this is the error:

      ERROR at line 1:
      ORA-29913: error in executing ODCIEXTTABLEOPEN callout
      ORA-29400: data cartridge error
      KUP-00554: error encountered while parsing access parameters
      KUP-01005: syntax error: found "identifier": expecting one of: "badfile, byteordermark, characterset, data, delimited, discardfile, exit, fields, fixed, load, logfile, nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string, skip, variable"
      KUP-01008: the bad identifier was: DELIMETED
      KUP-01007: at line 1 column 9
      ORA-06512: at "SYS.ORACLE_LOADER", line 14
      ORA-06512: at line 1
      ORA-06512: at "SCOTT.TESTER", line 9
      ORA-06512: at line 1

    Read the article
