Search Results

Search found 32551 results on 1303 pages for 'sql authentication'.

  • authentication winform + asp

    - by user156144
    I am building a desktop application that needs to update the current user's status frequently; the status will be available as an RSS feed. My plan is to create an ASP.NET folder and secure it using forms authentication. When the desktop application (written in C# as a WinForms app) needs to update the status, it can set WebRequest.Credentials and upload the data. Is there a better way of doing this? Thanks

  • psql: FATAL: Ident authentication failed for user "postgres"

    - by morpheous
    I have installed PostgreSQL and pgAdminIII on my Ubuntu Karmic box. I am able to use pgAdminIII successfully (i.e. connect/log on), but when I try to log in to the server with the same username/password on the command line (using psql), I get the error: psql: FATAL: Ident authentication failed for user "postgres". Does anyone know how to resolve this issue?
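
    A likely fix, assuming the stock Ubuntu layout (on Karmic the file should be /etc/postgresql/8.4/main/pg_hba.conf): pgAdminIII connects over TCP, which typically uses password (md5) authentication, while psql goes through the local Unix socket, which defaults to ident. Switching the local method to md5 makes psql prompt for the password instead:

        # /etc/postgresql/8.4/main/pg_hba.conf  (path is an assumption)
        # change the method on the "local" line from "ident" to "md5":
        local   all         all                               md5
        host    all         all         127.0.0.1/32          md5

        # then restart PostgreSQL so the change takes effect:
        sudo /etc/init.d/postgresql-8.4 restart

    Alternatively, running psql as the postgres system user (sudo -u postgres psql) satisfies the ident check without any config change.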

  • Digest authentication using LDAP only

    - by Elephant
    Is there a way to validate digest authentication using LDAP only? I.e., given the following request from a client (taken from Wikipedia):

    GET /dir/index.html HTTP/1.0
    Host: localhost
    Authorization: Digest username="Mufasa",
                   realm="[email protected]",
                   nonce="dcd98b7102dd2f0e8b11d0f600bfb0c093",
                   uri="/dir/index.html",
                   qop=auth,
                   nc=00000001,
                   cnonce="0a4f113b",
                   response="6629fae49393a05397450978507c4ef1",
                   opaque="5ccc069c403ebaf9f0171e9517f40e41"

    could I validate the user against LDAP, even though I don't know the user's password and therefore cannot construct a digest hash to compare with the response?
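
    For reference, a sketch of what the server must compute to verify that response (per RFC 2617, using the realm and password from the Wikipedia example this request was taken from). The crux is that HA1 is derived from the cleartext password, so a directory that only offers bind-based validation cannot check the response; you would need the cleartext password, or the precomputed HA1 stored in a retrievable LDAP attribute:

        import hashlib

        def md5_hex(s):
            return hashlib.md5(s.encode("utf-8")).hexdigest()

        # HA1 requires the cleartext password (or HA1 itself stored server-side)
        ha1 = md5_hex("Mufasa:testrealm@host.com:Circle Of Life")
        ha2 = md5_hex("GET:/dir/index.html")  # method:uri
        response = md5_hex(":".join([
            ha1,
            "dcd98b7102dd2f0e8b11d0f600bfb0c093",  # nonce
            "00000001",                            # nc
            "0a4f113b",                            # cnonce
            "auth",                                # qop
            ha2,
        ]))
        print(response)  # 6629fae49393a05397450978507c4ef1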

  • Maven site deploy authentication error with scp

    - by Navi
    I get an "Auth fail" error when running mvn -X site:deploy: org.apache.maven.wagon.authentication.AuthenticationException: Cannot connect. Reason: Auth fail. It seems that the correct private key is being used, and I can copy files to the project site directory normally using scp on Ubuntu. What could be causing this?
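
    One thing worth checking, sketched under the assumption that the key simply isn't being picked up: wagon-ssh takes its credentials from the <server> entry in ~/.m2/settings.xml whose <id> matches the <id> of the <site> element in the POM's <distributionManagement>. If the ids don't match, or an encrypted key is missing its <passphrase>, the symptom is exactly "Auth fail":

        <!-- ~/.m2/settings.xml; "project-site" is a placeholder id -->
        <servers>
          <server>
            <id>project-site</id>
            <username>deployuser</username>
            <privateKey>/home/deployuser/.ssh/id_rsa</privateKey>
            <passphrase>key-passphrase-if-any</passphrase>
          </server>
        </servers>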

  • facebook authentication using qt

    - by user310706
    Can authentication for Facebook be done without a web browser in Qt? The way I want to authenticate is to enter a username and password in text boxes and pass them as parameters in the URL. Is this possible? Please help; thanks in advance.

  • Download File from server that uses Icefaces form based authentication

    - by user266443
    I am a newbie to ICEfaces, and I have a requirement to download a document from a given URL (http://ipaddress/formexec?objectid=201). This URL uses form-based authentication that is deployed through ICEfaces. I traced the request to this URL and got the following line: &ice.submit.partial=false&ice.event.target=loginForm%3Aj_id33&ice.event.captured=loginForm%3Aj_id33. Are there any libraries or code samples for downloading the document by successfully passing the username and password?
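
    In the absence of a dedicated API, one workable approach is to script the login exchange and reuse the session cookie for the download. A minimal sketch in Python with the requests library; every URL and form-field name below is an assumption and should be copied from the request you traced:

        import requests

        BASE = "http://ipaddress"      # placeholder host from the question
        s = requests.Session()

        # 1. Load the login page so the server issues a session cookie and
        #    any hidden JSF/ICEfaces state fields (scrape those from the HTML).
        s.get(BASE + "/login.jsf")     # hypothetical login URL

        # 2. Replay the captured login POST with your credentials.
        s.post(BASE + "/login.jsf", data={
            "loginForm:username": "user",        # assumed field name
            "loginForm:password": "secret",      # assumed field name
            "ice.event.target": "loginForm:j_id33",
            "ice.submit.partial": "false",
        })

        # 3. The cookie jar now authenticates the actual download.
        doc = s.get(BASE + "/formexec", params={"objectid": "201"})
        with open("document.bin", "wb") as f:
            f.write(doc.content)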

  • Pass windows authentication username to asp variable

    - by Darren Cook
    Hi, I have a site that processes orders taken by phone into a SQL database. Access to the portal uses Windows Authentication, and I would like to pass the username of the order processor along with the order so that I can record who took it. How can I get the user name into a form element? The pages are written in classic ASP. Thanks.
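
    With Windows Authentication enabled (and anonymous access disabled), IIS exposes the caller's identity through the server variables, so the page can drop it straight into the form. A short sketch; the hidden-field name is just an example:

        <%
        ' DOMAIN\username of the Windows-authenticated visitor
        Dim orderTaker
        orderTaker = Request.ServerVariables("LOGON_USER")
        %>
        <input type="hidden" name="takenBy" value="<%= orderTaker %>" />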

  • Windows Media Service authentication issue

    - by George2
    Hello everyone, I am using Windows Server 2008 R2 with Windows Media Services, and on the client side I want to use Silverlight to play the media file. I am using VSTS 2008 + Silverlight 3 + ASP.NET + .NET 3.5. I want to know how to implement a custom authentication scheme: I have my own user database containing user names and passwords, and I want only logged-in users to be able to play content through Silverlight. Thanks in advance, George

  • Git through digest proxy authentication

    - by erick2red
    I want to do a "git clone" through a proxy server. The issue is that my proxy server uses digest authentication, and I can't find anything about this in the git documentation, nor a write-up from someone who has already solved it. I dug through Google search results and couldn't find anything helpful either. Thanks.
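
    A sketch of the usual starting point, with placeholder host and port: embed the credentials in the proxy URL and let libcurl (which git uses for HTTP) answer the 407 challenge. Depending on the git/libcurl version this may only attempt Basic; if the proxy insists on Digest, a small local relay proxy that authenticates upstream is the usual workaround:

        # credentials embedded in the proxy URL (placeholders)
        git config --global http.proxy http://user:password@proxy.example.com:8080
        git clone http://example.com/project.git

        # the same thing via environment variable
        export http_proxy=http://user:password@proxy.example.com:8080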

  • SharePoint authentication via a proxy server

    - by Prabhu
    A client has trouble logging into our web part (SP 2007). Apparently, his internet connection goes through a proxy server. He has no problem logging in to our main website, and authentication for both the web part and the website goes through the same API. Any suggestions? Thanks.

  • Simple check authentication decorator in Python + Pylons

    - by ensnare
    I'd like to write a simple decorator that I can put above actions in my controller to check authentication and redirect to the login page if the current user is not authenticated. What is the best way to do this? Where should the decorator go? How should I pass cookie info to the decorator? Sample code is greatly appreciated. Thank you!
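
    A minimal sketch, assuming a Beaker session key named 'user' and a /login route (both placeholders). Note that you don't hand cookie data to the decorator yourself: Pylons' session middleware has already decoded the request cookie by the time the action runs, so the decorator only inspects session:

        import functools

        from pylons import request, session
        from pylons.controllers.util import redirect_to

        def authenticate(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                if not session.get('user'):
                    # remember the target so the login action can bounce back
                    session['path_before_login'] = request.path_info
                    session.save()
                    return redirect_to('/login')
                return func(*args, **kwargs)
            return wrapper

    Used inside a controller:

        class AccountController(BaseController):
            @authenticate
            def index(self):
                return 'only for logged-in users'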

  • Implementing Forms-Based Authentication

    - by TeaDrinkingGeek
    I have a website for public users, but it also has an admin section of about 10 pages that I need to secure for the website admin only. If I implement forms-based authentication on those 10 pages, will the web.config changes also affect the public part of the website? I was looking at this example (http://support.microsoft.com/kb/301240), but it looks like it closes off public view for the entire application!?! Regards, Tea
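
    It doesn't have to: forms authentication is enabled application-wide, but authorization can be scoped with a <location> element so the public pages stay open. A sketch assuming the 10 admin pages live in an "admin" folder:

        <!-- web.config at the application root -->
        <configuration>
          <system.web>
            <authentication mode="Forms">
              <forms loginUrl="Login.aspx" />
            </authentication>
            <!-- public part: anonymous users allowed -->
            <authorization>
              <allow users="*" />
            </authorization>
          </system.web>

          <!-- only the admin folder denies anonymous (?) users -->
          <location path="admin">
            <system.web>
              <authorization>
                <deny users="?" />
              </authorization>
            </system.web>
          </location>
        </configuration>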

  • Is there a way to add AD LDS users to an AD Domain Group or allow them domain security rights?

    - by Tom
    I have a web application in which outside customers need to run transactions (stored procedures on SQL Server) on our domain. We have looked into AD LDS to keep these users separate from our domain. The problem we are having is granting the LDS users the AD security rights needed to access these stored procedures. For administration purposes we would like an AD group per transaction (stored procedure) whose members have permission to execute it. Is there a way to add LDS users to such an AD group, or otherwise grant them the security rights to do this? We have set up LDS and can authenticate an AD user through it to run these transactions. LDS is running on Server 08 R2; AD is also Server 08 R2. Thanks.

  • Adding a W2008 Authenticating Server to existing W2003 Domain?

    - by spelk
    I have an existing W2003 domain: a simple setup with one DC and a SQL Server (approx. 100 users). There are issues with Windows 7 clients and login scripts, and we're now seeing far more Windows 7 users turning up as they upgrade their PCs/laptops. What I want to do is add another server running W2008 and have it authenticate the Windows 7 clients, while leaving the W2003 server running as-is to prevent disruption to the network and the existing WinXP users. Is this possible? Any advice on how to do this without major disruption to the W2003 network?
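
    One prerequisite worth noting, sketched with a placeholder drive letter: before the first 2008 domain controller can join, the 2003 forest and domain must be prepared with adprep from the Windows 2008 installation media, run against the existing operations masters. This step does not change the existing DC's role, so XP clients keep authenticating as before:

        rem on the schema master, from the Server 2008 media
        rem (use adprep32.exe if the 2003 DC is 32-bit and the media is R2)
        D:\support\adprep\adprep.exe /forestprep

        rem then, on the infrastructure master of the domain
        D:\support\adprep\adprep.exe /domainprep /gpprep

    After that, the new 2008 server can be promoted with dcpromo as an additional domain controller.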

  • How to enable/disable authentication without password when executing commands as superuser?

    - by 44taka
    On a Fedora 19 system which I set up for somebody a while ago, I noticed that no authentication is required when commands are executed as the superuser. So, for example, when running Yum Extender, configuring the firewall, or running a command with sudo in the terminal, I am not asked for a password. (With graphical applications the authentication dialog pops up for a few milliseconds.) For better security I would like to disable this automatic, authentication-less assumption of superuser privileges. I do not remember if or how I enabled this behavior. I might have enabled it for the convenience of the machine's non-expert user, but I did not do any "fancy" things like editing config files: I did not edit the sudoers file (I just checked). I might have ticked a "Do not ask for password again" checkbox or something similar. Whatever I did, I would like to undo it and enforce authentication for superuser tasks again.
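
    Two places worth inspecting, sketched below: Fedora's default sudoers ships with a commented-out NOPASSWD line for the wheel group, which produces exactly this behavior if un-commented, and the graphical tools authorize through polkit, which may have a local override:

        # passwordless sudo grants (look for NOPASSWD tags)
        sudo grep -R "NOPASSWD" /etc/sudoers /etc/sudoers.d/
        # if this line is active, edit with visudo and remove the NOPASSWD: tag:
        #   %wheel  ALL=(ALL)  NOPASSWD: ALL

        # local polkit overrides that grant actions without prompting
        ls /etc/polkit-1/rules.d/ /etc/polkit-1/localauthority/50-local.d/ 2>/dev/null
        grep -R "ResultActive\|ResultAny" /etc/polkit-1/ 2>/dev/null

    Removing whatever override turns up (using visudo for any sudoers change) should restore the password prompts.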

  • Real tortoises keep it slow and steady. How about the backups?

    - by Maria Zakourdaev
    … Four tortoises were playing in the backyard when they decided they needed hibiscus flower snacks. They pooled their money and sent the smallest tortoise out to fetch the snacks. Two days passed and there was no sign of the tortoise. "You know, she is taking a lot of time," said one of the tortoises. A little voice from just outside the fence said, "If you are going to talk that way about me, I won't go."

    Is it too much to ask of a quite expensive 3rd-party backup tool that it be way faster than the SQL Server native backup? Or at least that it save a respectable amount of storage by producing a really smaller backup file? By saying "really smaller", I mean getting a file at least half the size.

    After Googling in an attempt to understand what other "SQL people" are using for database backups, I see that most are using one of three tools, the main players in the SQL backup area:

        - LiteSpeed by Quest
        - SQL Backup by Red Gate
        - SQL Safe by Idera

    The feedback about those tools is truly emotional and happy. However, while reading the forums and blogs I wondered whether many people are simply accustomed to using these tools since SQL 2000 and 2005. That is easy to understand: a 300GB database backup, for instance, using a regular SQL 2005 backup statement would have run for about 3 hours and produced a ~150GB file (depending on the content, of course). Then you take a 3rd-party tool which performs the same backup in 30 minutes, resulting in a 30GB file, and it leaves you speechless; you run to management persuading them to buy it, because it is definitely worth the price. In addition to the increased speed and disk-space savings you also get backup file encryption and virtual restore, features that are still missing from SQL Server. But in case you, like me, don't need these additional features and only want a tool that performs a full backup MUCH faster AND produces a far smaller backup file (like the gain you observed back in the SQL 2005 days), you will be quite disappointed. The SQL Server backup compression feature has totally changed the market picture.

    Medium size database

    Take a look at the table in the original post to see how my SQL Server 2008 R2 compares to the other tools when backing up a 300GB database. It appears that, when talking about backup speed, SQL 2008 R2 compresses and performs the backup in similar overall times as the three other tools. The 3rd-party tools' maximum compression level takes twice as long. The backup file gain is not that impressive, except at the highest compression levels, but the price you pay there is a very high CPU load and a much longer run time. Only SQL Safe by Idera was quite fast at its maximum compression level, but it used 95% CPU on the server for most of the run time. Note that I used two types of destination storage, SATA (11 disks) and FC (53 disks); obviously, on the faster storage the backup was ready in half the time.

    Looking at the above results, should we spend money and bother with another layer of complexity and a software middle-man for medium-sized databases? I'm definitely not going to do so.

    Very large database

    As the next phase of this benchmark, I moved to a 6-terabyte database, which was actually my main backup target. Note how using multiple files enables the SQL Server backup operation to use parallel I/O and remarkably increases its speed, especially when the backup device is heavily striped. SQL Server supports a maximum of 64 backup devices for a single backup operation, but the most speed is gained when using one file per CPU; in the case above, 8 files for a server with 2 quad-core CPUs. The impact of additional files is minimal. However, SQL Safe doesn't show any speed improvement between 4 files and 8 files.

    Of course, with such huge databases every half percent of compression translates into noticeable numbers. Saving almost 470GB of space may turn the backup tool into quite a valuable purchase. Still, the backup speed and the high CPU load are variables that should be taken into consideration. As for us, the backup speed is more critical than the storage, and we cannot allow a production server to sustain 95% CPU for such a long time. Bottom line: 3rd-party backup tool developers, we are waiting for a breakthrough release. There are a few unanswered questions, like the restore speed comparison between the different tools and the impact of multiple backup files on the restore operation. Stay tuned for the next benchmarks.

    Benchmark server: SQL Server 2008 R2 SP1, 2 quad-core CPUs.
    Database location: NetApp FC 15K aggregate, 53 disks.

    Backup statements: no matter how good a tool's UI is, we need to run the backup tasks from inside SQL Server Agent to make sure they are covered by our monitoring systems. I have used the extended stored procedures (command-line execution is also an option; I haven't noticed any impact on backup performance).

    SQL Server native backup:

        backup database <DBNAME>
        to disk= '\\<networkpath>\par1.bak',
           disk= '\\<networkpath>\par2.bak',
           disk= '\\<networkpath>\par3.bak'
        with format, compression

    LiteSpeed:

        EXECUTE master.dbo.xp_backup_database
            @database = N'<DBName>',
            @backupname = N'<DBName> full backup',
            @desc = N'Test',
            @compressionlevel = 8,
            @filename = N'\\<networkpath>\par1.bak',
            @filename = N'\\<networkpath>\par2.bak',
            @filename = N'\\<networkpath>\par3.bak',
            @init = 1

    SQL Backup:

        EXECUTE master.dbo.sqlbackup
            '-SQL "BACKUP DATABASE <DBNAME>
            TO DISK= ''\\<networkpath>\par1.sqb'',
               DISK= ''\\<networkpath>\par2.sqb'',
               DISK= ''\\<networkpath>\par3.sqb''
            WITH DISKRETRYINTERVAL = 30, DISKRETRYCOUNT = 10, COMPRESSION = 4, INIT"'

    SQL Safe:

        EXECUTE master.dbo.xp_ss_backup
            @database = 'UCMSDB',
            @filename = '\\<networkpath>\par1.bak',
            @backuptype = 'Full',
            @compressionlevel = 4,
            @backupfile = '\\<networkpath>\par2.bak',
            @backupfile = '\\<networkpath>\par3.bak'

    If you still insist on using 3rd-party tools for the backups in your production environment at maximum compression level, you will definitely need to consider limiting CPU usage, which will increase the backup operation time even more:

        - Red Gate: use the THREADPRIORITY option (values 0-6)
        - LiteSpeed: use @throttle (a percentage, like 70%)
        - SQL Safe: the only thing I have found was the @Threads option

    Yours, Maria

  • Speed up SQL Server queries with PREFETCH

    - by Akshay Deep Lamba
    Problem: The SAN data volume has a throughput capacity of 400MB/sec; however, my query is still running slow and is waiting on I/O (PAGEIOLATCH_SH). Windows Performance Monitor shows a data volume speed of 4MB/sec. Where is the problem, and how can I find it?

    Solution: This is another summary of a great article published by R. Meyyappan at www.sqlworkshops.com. In my opinion, this is the first article that highlights and explains, with working examples, how PREFETCH determines the performance of a Nested Loop join. First of all, recall that Prefetch is a mechanism with which SQL Server can fire up many I/O requests in parallel for a Nested Loop join. When SQL Server executes a Nested Loop join, it may or may not enable Prefetch, according to the number of rows in the outer table: if the number of rows in the outer table is greater than 25, SQL Server will enable and use Prefetch to speed up query performance, but if it is less than 25 rows it will not. In this section we are going to see different scenarios where Prefetch is automatically enabled or disabled. These examples use only two tables, RegionalOrders and Orders; if you want to create the sample tables and sample data, please visit www.sqlworkshops.com. The breakdown of the data in the RegionalOrders table is shown in the original article, and the Orders table contains about 6 million rows.

    In this first example, I create a stored procedure against the two tables and then execute it. Before running the stored procedure, I include the actual execution plan.

        --Example provided by www.sqlworkshops.com
        --Create procedure that pulls orders based on City
        --Do not forget to include the actual execution plan
        CREATE PROC RegionalOrdersProc @City CHAR(20)
        AS
        BEGIN
        DECLARE @OrderID INT, @OrderDetails CHAR(200)
        SELECT @OrderID = o.OrderID, @OrderDetails = o.OrderDetails
        FROM RegionalOrders ao INNER JOIN Orders o ON (o.OrderID = ao.OrderID)
        WHERE City = @City
        END
        GO

        SET STATISTICS time ON
        GO

        --Example provided by www.sqlworkshops.com
        --Execute the procedure with parameter SmallCity1
        EXEC RegionalOrdersProc 'SmallCity1'
        GO

    After running the stored procedure, if we right-click the Clustered Index Scan and click Properties, we can see the Estimated Number of Rows is 24. If we right-click Nested Loops and click Properties, we do not see Prefetch, because it is disabled. This behavior was expected, because the number of rows containing the value 'SmallCity1' in the outer table is less than 25.

    Now, if I run the same procedure with the parameter 'BigCity', will Prefetch be enabled?

        --Example provided by www.sqlworkshops.com
        --Execute the procedure with parameter BigCity
        --We are using the cached plan
        EXEC RegionalOrdersProc 'BigCity'
        GO

    As we can see from the screenshot in the original article, Prefetch is not enabled and the query takes around 7 seconds to execute. This is because the query used the cached plan from 'SmallCity1', which had Prefetch disabled. Please note that even though we have 999 rows for 'BigCity', the Estimated Number of Rows is still 24.

    Finally, let's clear the procedure cache to trigger a new optimization and execute the procedure again.

        DBCC freeproccache
        GO
        EXEC RegionalOrdersProc 'BigCity'
        GO

    This time our procedure runs in under a second, Prefetch is enabled, and the Estimated Number of Rows is 999.

    The RegionalOrdersProc can be optimized by using an optimizer hint, as in the example below; I have also shown some other hints that could be used as well.

        --Example provided by www.sqlworkshops.com
        --You can fix the issue by using any of the following hints
        --Create procedure that pulls orders based on City
        DROP PROC RegionalOrdersProc
        GO
        CREATE PROC RegionalOrdersProc @City CHAR(20)
        AS
        BEGIN
        DECLARE @OrderID INT, @OrderDetails CHAR(200)
        SELECT @OrderID = o.OrderID, @OrderDetails = o.OrderDetails
        FROM RegionalOrders ao INNER JOIN Orders o ON (o.OrderID = ao.OrderID)
        WHERE City = @City
        --Hinting the optimizer to use SmallCity2 for estimation
        OPTION (optimize FOR (@City = 'SmallCity2'))
        --Hinting the optimizer to estimate for the current parameters
        --option (recompile)
        --Hinting the optimizer not to use the histogram but rather
        --density for estimation (the average of all 3 cities)
        --option (optimize for (@City UNKNOWN))
        --option (optimize for UNKNOWN)
        END
        GO

    Conclusion: this tip was mainly aimed at illustrating how Prefetch can speed up query execution and how a different number of rows can trigger this.

  • Customize Entity Framework SSDL & SQL Generation

    - by Dane Morgridge
    In almost every talk I have done on Entity Framework I get questions on how to produce custom SSDL or SQL when using model-first development. Quite a few of these questions have required custom changes to the SSDL, which of course can be a problem if it is being auto-generated. Luckily, there is a tool that can help. In the Visual Studio Gallery on MSDN there is the Entity Designer Database Generation Power Pack. It gives you the ability to select different generation strategies, and it also allows you to inject custom T4 templates into the generation workflow so that you can customize the SSDL and SQL generation. When you select to generate a database from a model, the usual dialog is replaced by one with more options.

    You can clone an individual workflow for either the current project or the current machine. The templates are installed at "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\Extensions\Microsoft\Entity Framework Tools\DBGen" on my local machine, and you can make a copy of any template there. If you clone the strategy and open it up, you will see the workflow: each item in the sequence defines the execution of a T4 template. The XAML for the workflow is listed below so you can see where the T4 files are defined. You can simply make a copy of an existing template and make whatever changes you need.

        <Activity x:Class="GenerateDatabaseScriptWorkflow" ... >
          <x:Members>
            <x:Property Name="Csdl" Type="InArgument(sde:EdmItemCollection)" />
            <x:Property Name="ExistingSsdl" Type="InArgument(s:String)" />
            <x:Property Name="ExistingMsl" Type="InArgument(s:String)" />
            <x:Property Name="Ssdl" Type="OutArgument(s:String)" />
            <x:Property Name="Msl" Type="OutArgument(s:String)" />
            <x:Property Name="Ddl" Type="OutArgument(s:String)" />
            <x:Property Name="SmoSsdl" Type="OutArgument(ss:SsdlServer)" />
          </x:Members>
          <Sequence>
            <dbtk:ProgressBarStartActivity />
            <dbtk:CsdlToSsdlTemplateActivity SsdlOutput="[Ssdl]" TemplatePath="$(VSEFTools)\DBGen\CSDLToSSDL_TPT.tt" />
            <dbtk:CsdlToMslTemplateActivity MslOutput="[Msl]" TemplatePath="$(VSEFTools)\DBGen\CSDLToMSL_TPT.tt" />
            <ded:SsdlToDdlActivity ExistingSsdlInput="[ExistingSsdl]" SsdlInput="[Ssdl]" DdlOutput="[Ddl]" />
            <dbtk:GenerateAlterSqlActivity DdlInputOutput="[Ddl]" DeployToScript="True" DeployToDatabase="False" />
            <dbtk:ProgressBarEndActivity ClosePopup="true" />
          </Sequence>
        </Activity>

    So, as you can see, this tool enables you to make some pretty heavy customizations to how the SSDL and SQL get generated. You can get more info, and download the tool, at http://visualstudiogallery.msdn.microsoft.com/en-us/df3541c3-d833-4b65-b942-989e7ec74c87. There is a comments section on the site, so make sure you let the team know what you like and what you don't like. Enjoy!

  • SQL Saturday and Exploring Data Privacy

    - by Johnm
    I have been highly impressed with the growth of the SQL Saturday phenomenon. It seems that an announcement for a new wonderful event finds its way to my inbox on a daily basis. I have had the opportunity to attend the first SQL Saturdays for Tampa, Chicago, Louisville and, recently, my home town of Indianapolis. It is my hope that there will be many more in my future. This past weekend I had the honor of being selected to speak amid a great line-up of speakers at SQL Saturday #82 in Indianapolis. My session topic/title was "Exploring Data Privacy". Below is a brief synopsis of my session:

    Data Privacy in a Nutshell
        - Definition of data privacy
        - Examples of personally identifiable data
        - Examples of sensitive data

    Laws and Stuff
        - Various examples of laws, regulations and policies that influence the definition of data privacy
        - General rules of thumb that encompass most laws

    Your Data Footprint
        - Who has personal information about you?
        - What are you exchanging data privacy for?
        - The amazing resilience of data
        - The cost of data loss

    Weapons of Mass Protection
        - Data classification
        - Extended properties
        - Database object schemas
        - An extraordinarily brief introduction to encryption
        - The amazing data professional <- the most important point of the entire session!

    The subject of data privacy is quickly making its way to the forefront of the minds of many data professionals. Somewhere out there, someone is storing personally identifiable and other sensitive data about you. In some cases it is kept reasonably secure; in other cases it is kept totally exposed, without consideration of its potential to damage you. Who has access to it, and how is it being used? Are we being unnecessarily required to supply sensitive data in exchange for products and services? These are just a few of the questions on everyone's mind. As data-loss events of grand scale hit the headlines in ever more frequent succession, the level of frustration and the urgency for a solution increase. I assembled this session with the intent of raising awareness of sensitive data and reminding us all that we, data professionals, are the ones who have the greatest impact and influence on how sensitive data is regarded and protected. Mahatma Gandhi once said, "Be the change you want to see in the world." This is guidance that I keep near to my heart as I approach the topic of data privacy.

  • IIS 401.3 - Unauthorized on only 1 server out of 3 set up for network load balancing

    - by Tony
    Over the weekend our server admin set up two virtual Windows 2008 machines with IIS installed and configured them under NLB. I then changed the application pool the website runs under to our domain account, which has proper access to the database and to the file share hosting our .NET web application (Sitefinity), and switched it to .NET 4 Integrated. NLB and everything else ran fine on both servers.

    He brought up the third server for our cluster on Tuesday and I performed the same actions. The only difference was that I was given admin rights on the third server so I could set it up remotely instead of going to his office. He has full control over the share and the NTFS permissions on \\hostname\Sitefinity, and I believe I only had read access. I pointed the website to the same \\hostname\Sitefinity\sitename share the others were on, and the authentication/authorization test settings passed. I hit the site from http://localhost (as I had done successfully on the first two servers before trying the cluster's IP address) and received: HTTP Error 401.3 - Unauthorized.

    I've verified many times that the application pool is running under the same service account. I tried hitting a simple test.htm; it works fine on the first two servers, but I get the same 401.3 on the third. I copied my dev project to the local inetpub directory, re-pointed the website, and that ran perfectly. I turned on Failed Request Tracing, and it acts as if it's still running under the local IUSR account instead of my domain account. Here is an excerpt of the File Cache Access Start event and the error from the trace:

        FileName                  \\hostname\sitefinity\sitename\test.htm
        UserName                  IUSR
        DomainName                NT AUTHORITY
        ----------
        Successful                false
        FileFromCache             false
        FileAddedToCache          false
        FileDirmoned              true
        LastModCheckErrorIgnored  true
        ErrorCode                 2147942405
        LastModifiedTime
        ErrorCode                 Access is denied. (0x80070005)
        ----------
        ModuleName                IIS Web Core
        Notification              2
        HttpStatus                401
        HttpReason                Unauthorized
        HttpSubStatus             3
        ErrorCode                 2147942405
        ConfigExceptionInfo
        Notification              AUTHENTICATE_REQUEST
        ErrorCode                 Access is denied. (0x80070005)
        ----------

    My personal AD account was then granted read/write permissions on the share, so I created a new application pool and moved the site under it in case there was an issue with the original pool, but with no success. I created another pool under my own account and it still failed. It seems like the site isn't accessing the files under the account my application pools run under, although that's the only way I've done things before. I set the Physical Path Credentials in Advanced Settings on the site to the service account, and it threw a 500 error of some sort, so I assume that's not the answer (and I don't have to do it on the other servers). It's as if I'm somehow forcing impersonation of the IUSR account.
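
    Two things consistent with that trace, offered as a sketch with placeholder account names: in IIS 7.5, Anonymous Authentication runs as the built-in IUSR account by default even when the application pool uses a domain account. So either set Anonymous Authentication's credentials to "Application pool identity" for the site on the third server, or grant the service account read/execute on both the share permissions and the NTFS ACLs, e.g.:

        rem run on the file server hosting the content (names are placeholders)
        icacls D:\Shares\Sitefinity /grant "DOMAIN\svc-apppool:(OI)(CI)RX" /T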
