Search Results

Search found 10815 results on 433 pages for 'stored procedure'.


  • Installing g77 in Ubuntu 12.04 LTS (32bit)

    - by pksahani
    I am using Ubuntu 12.04 LTS (32-bit, i386) on my desktop PC. I need the g77 compiler for some specific applications: the app can only be installed once g77 is present, because it was designed around the g77 Fortran compiler and can't be used with gfortran, the standard compiler available in 12.04 LTS. Please guide me through the procedure to install g77 on 12.04. I have been trying apt-get update and apt-get install g77 after changing the sources.list file. After that I am able to install g77, but when I try to compile a Fortran program it shows these errors: /usr/bin/ld: cannot find crt1.o: No such file or directory; /usr/bin/ld: cannot find crti.o: No such file or directory; /usr/bin/ld: cannot find -lgcc_s; collect2: ld returned 1 exit status. Please help me; I am struggling a lot to fix this.

    Read the article

  • Installed Ubuntu 14.04. How do I get it working with my new AMD R9 280X? When I run it, I get garbage on the display

    - by user289455
    I installed Ubuntu 14.04 on my Gigabyte A88X FM2+ motherboard with an A10 7700K processor, and everything worked perfectly. The system is dual-booted with Windows 8.1 for gaming, and I subsequently installed an AMD R9 280X GPU. Windows was easy: it kept working with the new card installed, and I just needed to download and install the driver, reboot, and it was done. Ubuntu, however, boots into a blank display. I enter the password and can log in; the menu on the left and the bar at the top are visible, but when I try to open an app I get blocks and rubbish on the display. My preference is not to remove the card. So my question is: what procedure could I follow to log in at a text-based terminal and then download and install the AMD drivers?

    Read the article

  • Broadcom wireless card disappears after restart

    - by user10854
    I used 'Additional Drivers' to install the 'Broadcom STA wireless driver' and it returns an error. jockey.log contains the following line numerous times: 2011-02-14 21:24:06,945 DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: blacklisted, b43legacy: blacklisted. After it returns the error, the network card works temporarily until I restart the laptop. After a restart I have to go through the procedure of trying to activate the driver again; it returns an error, yet the card works temporarily. The network card, on a Dell Inspiron 1545, is: Broadcom Corporation BCM4312 802.11b/g LP-PHY [14e4:4315] Rev 01. I have been trying to solve this myself for many hours. Any help is appreciated.

    Read the article

  • Google accused of manipulating its search rankings by a French site that is filing a new complaint in Brussels

    Google accused of manipulating its search rankings, by a French site that is filing a new complaint in Brussels. Update of 22/02/11: while the case is still running its course after the filing of its first complaint against Google, the French site Ejustice.fr has reportedly opened a second proceeding against the search engine. Ejustice's biggest accusation, revealed in today's edition of Les Echos, concerns the manipulation of the ranking of results in Google's search engine. Google is said to have de-listed the French site, as well as its subsidiaries such as Eguides.fr, in retaliation for the first complaint. Curiously (going by...

    Read the article

  • Reading parameters and files in the browser, looking for how to execute on the server

    - by jbcolmenares
    I have a site built in Rails, which uses JavaScript to load files and generate forms for the user to input certain information. Those files and parameters are then used by Fortran code on the server. When the UI ran on the server (using Qt), I would create a parameters file and execute the Fortran code in a thread so it wouldn't block the computer. Now that it is web-based, I need to make the server and the browser talk. What's the procedure for that? Where should I start looking? I'm already using Rails + JavaScript; I need that extra tool to do the talking, and I have no idea where to start.

    Read the article

  • Get and install Nvidia GeForce 8400 GS driver

    - by williepabon
    Recently I changed my OS from 10.04 to 11.10 (bugs), but after doing so the video driver for the 8400 GS disappeared (it was there in 10.04). I followed the same procedure I used to install it in 10.04, mainly:

    sudo apt-get --purge remove nvidia-current
    sudo apt-get --purge autoremove
    sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
    sudo apt-get update
    sudo apt-get -y install nvidia-current

    but it didn't work, even though the commands seemed to install the driver without problems. Right now my machine is running on the standard driver, as shown:

    williepabon@WP-WrkStation:~$ sudo lshw -C display
    [sudo] password for williepabon:
      *-display
           description: VGA compatible controller
           product: nVidia Corporation
           vendor: nVidia Corporation
           physical id: 0
           bus info: pci@0000:05:00.0
           version: a2
           width: 64 bits
           clock: 33MHz
           capabilities: pm msi pciexpress bus_master cap_list rom
           configuration: driver=nouveau latency=0
           resources: irq:16 memory:de000000-deffffff memory:c0000000-cfffffff memory:d0000000-d1ffffff ioport:cc80(size=128) memory:dfc00000-dfc7ffff

    Any suggestions to correct the problem? Thanks

    Read the article

  • SQL Server Data Type Precedence

    I am executing a simple query/stored procedure from my application against a large table and it's taking a long time to execute. The column I'm using in my WHERE clause is indexed and it's very selective. The search column is not wrapped in a function so that's not the issue. What could be going wrong?
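
    One frequent cause that matches the entry's title is data type precedence: when the parameter's type outranks the column's type (for example an NVARCHAR parameter compared against a VARCHAR column), SQL Server applies the implicit conversion to the column side, which can turn an index seek into a scan. A minimal T-SQL sketch of the effect; the table, column and parameter names here are invented for illustration:

        -- Hypothetical table: a VARCHAR column with a nonclustered index on it.
        CREATE TABLE dbo.Customers
        (
            CustomerID  INT IDENTITY(1,1) PRIMARY KEY,
            AccountCode VARCHAR(20) NOT NULL
        );
        CREATE INDEX IX_Customers_AccountCode ON dbo.Customers (AccountCode);

        -- NVARCHAR has higher precedence than VARCHAR, so the column gets a
        -- CONVERT_IMPLICIT and the optimizer may scan the index instead of seeking.
        DECLARE @code NVARCHAR(20) = N'ACME-001';
        SELECT CustomerID FROM dbo.Customers WHERE AccountCode = @code;

        -- Matching the parameter type to the column keeps the comparison sargable.
        DECLARE @code2 VARCHAR(20) = 'ACME-001';
        SELECT CustomerID FROM dbo.Customers WHERE AccountCode = @code2;

    Comparing the two execution plans (a scan with CONVERT_IMPLICIT on the column versus a plain seek) is usually enough to confirm or rule this possibility out.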

    Read the article

  • which one is the safe site to buy cheap wildstar gold?

    - by user50866
    Wsoplat.com is a professional online shop which provides the cheapest wildstar gold, the fastest delivery, and the best 24/7 online service for players. Secure guarantee: cheap wildstar gold offered by wsoplat.com is reliably sourced, safe and honored. Lowest price: we are constantly trying to offer the lowest prices on Wildstar Gold for our loyal customers. Convenient shopping procedure & secure delivery method: we are very experienced in this business, and every order is processed both smoothly and efficiently. Friendly service and instant delivery: Wsoplat's delivery department works 24/7/365, and we have professional and friendly customer service operators who can help you buy wildstar gold, accounts, items and more. 5% discount code to buy WildStar gold at wsoplat - WSOPLAT http://www.wsoplat.com/

    Read the article

  • Wait random number of minutes

    - by TiborKaraszi
    Why on earth would you want to do that? you ask. Say you have a job that is scheduled to start at the same time on a number of servers. This might be because you have an SQL Server master/target server environment (MSX/TSX), or you quite simply script a job and execute that script on several servers. You probably want to spread the load on your SAN and virtual machine host a bit. This is the exact reason I use this procedure. I frequently use MSX servers and I usually add a job step (executing this...(read more)
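
    The linked post contains the author's own procedure, which the excerpt cuts off. As a rough sketch of the general technique only (the procedure and parameter names below are invented), a random wait in T-SQL can be built from NEWID() and WAITFOR DELAY:

        -- Sketch: wait a random number of seconds, up to @MaxMinutes minutes.
        CREATE PROCEDURE dbo.WaitRandomMinutes
            @MaxMinutes int = 10
        AS
        BEGIN
            -- Random whole number of seconds in the range 0 .. @MaxMinutes * 60.
            DECLARE @seconds int = ABS(CHECKSUM(NEWID())) % (@MaxMinutes * 60 + 1);

            -- WAITFOR DELAY accepts a variable but not an expression, so format hh:mm:ss first.
            DECLARE @delay char(8) = CONVERT(char(8), DATEADD(SECOND, @seconds, 0), 108);
            WAITFOR DELAY @delay;
        END

    Added as the first step of the scheduled job on each server, something like this staggers the real work by a different amount on every run.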

    Read the article

  • Programming Language for RF scanner [closed]

    - by Sid
    I am a software developer, mostly on Windows systems. I have been tasked to create an application for an RF barcode scanner, and I don't have any experience with this kind of software. I am open to learning new programming languages to develop the software for this barcode scanner. From what I saw in our meeting, it uses Windows or Windows Mobile as its OS. Can someone direct me to a tutorial site, or just tell me what language I should use? The requirements are mostly: log in, a UI for entry of details plus the barcode, a process button, and a connection to SQL Server via a Wi-Fi access point to call the procedure afterwards. I am not developing a barcode scanner; I am developing UI software, like a mobile app. I tried to Google it, but I don't know what exact keywords I should search for. My question is: what programming language is mostly used to develop this kind of software?
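
    Whatever language ends up being chosen on the client side, the "call the procedure afterwards" part of the requirement is just an ordinary SQL Server stored procedure. A purely hypothetical sketch (every name below is invented) of the kind of procedure such an app might call over the Wi-Fi connection:

        -- Hypothetical: records one scanned barcode together with the details entered in the UI.
        CREATE PROCEDURE dbo.LogScannedItem
            @Barcode   varchar(50),
            @Details   nvarchar(200),
            @ScannedBy nvarchar(50)
        AS
        BEGIN
            SET NOCOUNT ON;
            INSERT INTO dbo.ScannedItems (Barcode, Details, ScannedBy, ScannedAt)
            VALUES (@Barcode, @Details, @ScannedBy, SYSUTCDATETIME());
        END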

    Read the article

  • Could Apple face an antitrust complaint as a consequence of its opposition to Flash?

    Could Apple face an antitrust complaint as a consequence of its opposition to Flash? Fairly serious reports are circulating suggesting that the American authorities are taking a close interest in Apple and are considering opening an investigation into abuse of a dominant position within the next few days. The Federal Trade Commission and the Department of Justice are reportedly vying for the honor of starting the proceedings. It is important to remember that opening an investigation does not necessarily lead to a conviction. So how did Apple get to this point? First there was the change to its development policy a few weeks ago, which now requires all iPhone/iPad programmers to...

    Read the article

  • What should I do when Ubuntu freezes?

    - by ændrük
    All operating systems freeze sometimes, and Ubuntu is no exception. What should I do to regain control when... just one program stops responding? nothing at all responds to mouse clicks or key presses? the mouse stops moving entirely? In what order should I try various solutions before deciding to pull the power plug? What should I do when starting up Ubuntu fails? Is there a diagnostic procedure I can follow?

    Read the article

  • Business Layer Design in J2EE Project

    - by user63157
    The project I am currently working on is being developed with Spring, Hibernate and Struts. The business layer consists of simple Java beans with no behavior in them, only properties and getter and setter methods; the services are written on top of them, operate on them, and call the DAO layer methods. My question is: is this an object-oriented way of designing, or simply the procedural way, in which the data and the functions that operate on it are not kept together? Please provide your thoughts and input on how business logic is designed and implemented in a J2EE application: does the domain model contain business methods, or are the domain objects simply dumb objects that hold only data, with services written on top of them?

    Read the article

  • Design for interruptible operations

    - by tpaksu
    I couldn't find a better title, but here it is: 1) when the user clicks a button, the code starts to work; 2) when another button is clicked, it should stop whatever it is doing and start running the second button's code; 3) or, with no user interaction, an electrical power-down is detected from a connected device, so our software should cancel the current operation and start the power-down procedure. How is this design usually implemented in code? I mean the "stop what you are doing" part. If you would say events, event handlers, etc., how do you bind a condition to the event, and how do you tell the program to end its process without laddered ifs like: method1(); if (powerdown) return; method2(); if (powerdown) return; etc.

    Read the article

  • 12.04 Login gone wrong

    - by Mark H
    I seem to have a fault somewhere in the login procedure. When I booted up, I found that if I selected Guest I could use the computer, but if I selected my administrator account I was taken straight to the terminal. So I opened a guest session, went to user settings, unlocked my admin account (it accepted the password!), and amended it to show "Password: None" and automatic login. I now find that when I boot up I am taken straight to the terminal; if I exit the terminal I get the login screen; if I select the administrator login I go back to the terminal; and if I select Guest or my normal user account I can use the PC. So I cannot log in as an administrator, which means admin functions such as updates are no longer accessible. Sorry to be so long-winded, but I am stuck. Can anyone suggest anything? I am a beginner with this.

    Read the article

  • What is the Big-Oh notation for this? [closed]

    - by laniam
    procedure quartersearch(x : integer; a1, a2, ..., an : increasing integers)
    i := 1   {i is left endpoint of search interval}
    j := n   {j is right endpoint of search interval}
    while i < j
    begin
        m := ⌊(i + j)/4⌋
        if x ? am then i := m + 1
        else if x ? am then m := ⌊(i + j)/4⌋
        else if x ? am then m := 2⌊(i + j)/4⌋
        else if x ? am then m := 3⌊(i + j)/4⌋
        else j := m
    end
    if x = ai then location := i
    else location := 0
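
    The comparison operators were lost in the excerpt above, but assuming the intent is that each pass of the while loop probes am at one or more quarter points and then shrinks the interval [i, j] to at most a constant fraction c of its previous length (0 < c < 1), the usual argument applies: after k passes the interval has length at most

        n \cdot c^{k},

    which falls below 1 once k > \log_{1/c} n, so the loop runs O(\log n) times and the procedure performs O(\log n) comparisons overall, the same order as binary search (only the constant factor changes).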

    Read the article

  • File recovery of windows after installing ubuntu

    - by user282619
    On one of my computers I installed Ubuntu 14.04. Actually, I tried to dual-boot it along with Windows 7, and I tried to do the partitioning during the Ubuntu installation procedure. Now all of my 500 GB hard disk has been occupied by Ubuntu and I cannot access Windows. Since I have all my documents in Windows, I want to recover them. Will I be able to recover my Windows files? If yes, can you please help me with the solution?

    Read the article

  • What is the easiest and fastest way to display an SDL_Surface in a window with SDL2?

    - by Semmu
    I would like to have an SDL_Surface representing the contents of the window, just like in the old days with SDL 1.2. What is the best and fastest way to do this in SDL2? What I found is that I need an SDL_Window, an SDL_Renderer for that window, an SDL_Texture to render, and an SDL_Surface to create the texture from. This seems a bit too much to me, since I just want to display a single image on the screen, not to mention the impact on performance. On my machine (a Lenovo Y510p laptop) this whole procedure takes 9 ms, without any memory allocation, using only pre-allocated variables and a totally black SDL_Surface. Is there a way I could speed things up?

    Read the article

  • How to disable the automatic optimizer statistics collection job in 11g

    - by Liu Maclean(???)
    How do you disable the automatic statistics collection job in 11g? In 11g the auto stats gather job runs as an automated maintenance (auto task) client, which is handled differently from 10g:

    SQL> select client_name, status from DBA_AUTOTASK_CLIENT;

    CLIENT_NAME                                                      STATUS
    ---------------------------------------------------------------- --------
    auto optimizer stats collection                                  ENABLED
    auto space advisor                                               ENABLED
    sql tuning advisor                                               ENABLED

    begin
      DBMS_AUTO_TASK_ADMIN.DISABLE(
        client_name => 'auto optimizer stats collection',
        operation   => NULL,
        window_name => NULL);
    end;
    /

    PL/SQL procedure successfully completed.

    SQL> select client_name, status from DBA_AUTOTASK_CLIENT;

    CLIENT_NAME                                                      STATUS
    ---------------------------------------------------------------- --------
    auto optimizer stats collection                                  DISABLED
    auto space advisor                                               ENABLED
    sql tuning advisor                                               ENABLED
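
    To turn the job back on later, the same package exposes a matching ENABLE call with the same arguments; a minimal sketch mirroring the DISABLE call above:

        begin
          DBMS_AUTO_TASK_ADMIN.ENABLE(
            client_name => 'auto optimizer stats collection',
            operation   => NULL,
            window_name => NULL);
        end;
        /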

    Read the article

  • Oracle EBS R12 on Exadata V2, MAA and High Performance

    - by longchun.zhu
    A hands-on workshop covering the following labs:

    1. Oracle EBS R12 on Database Machine MAA & Performance Architecture.
    2. Oracle EBS R12 Single Instance Node Deployment Procedure.
    3. Start Rapid Install Wizard.
    4. Oracle EBS R12 Single Instance Node Chinese Patch Update.
    5. Applying Patches.
    6. Upgrade Application Database Version to 11g Release 2.
    7. Database Upgrade.
    8. Deploy Clone Application Database to Sun Oracle Database Machine.
    9. Migrate Application Database File System to Exadata ASM Storage.
    10. Convert Application Database Single Instance to RAC.
    11. Configure High Availability & High Performance Architecture with Exadata.

    Read the article

  • BizTalk Cross Reference Data Management Strategy

    - by charlie.mott
    Article Source: http://geekswithblogs.net/charliemott

    This article describes an approach to the management of cross reference data for BizTalk. Some articles about the BizTalk Cross Referencing features can be found here:
    http://home.comcast.net/~sdwoodgate/xrefseed.zip
    http://geekswithblogs.net/michaelstephenson/archive/2006/12/24/101995.aspx
    http://geekswithblogs.net/charliemott/archive/2009/04/20/value-vs.id-cross-referencing-in-biztalk.aspx

    Options

    Current options for managing this data include:
    - Maintaining xml files in the format that can be used by the out-of-the-box BTSXRefImport.exe utility.
    - Use of user interfaces that have been developed to manage this data: the BizTalk Cross Referencing Tool and the XRef XML Creation Tool.

    However, there are the following issues with the above options:
    - The 'BizTalk Cross Referencing Tool' requires a separate database to manage.
    - The 'XRef XML Creation' tool has no means of persisting the data settings.
    - The 'BizTalk Cross Referencing Tool' generates integers in the common id field. I prefer to use a string (e.g. acme.country.uk). This is more readable (see naming conventions below).
    - Both UI tools continue to use BTSXRefImport.exe. This utility replaces all xref data, which can be a problem in continuous integration environments that support multiple clients or BizTalk target instances. If you upload the data for one client it would destroy the data for another client, yet in TFS, where builds run concurrently, this would break unit tests.

    Alternative Approach

    In response to these issues, I instead use simple SQL scripts to directly populate the BizTalkMgmtDb xref tables, combined with a data namespacing strategy to isolate client data.

    Naming Conventions

    All data keys use namespace prefixing. The pattern will be <companyName>.<dataType>. The naming convention is to use lower casing for all items. The data must follow this pattern to isolate it from other company cross-reference data. The table below shows some sample data. (Note: this data uses the 'ID' cross-reference tables; the same principles apply for the 'value' cross-referencing tables.)

    Table.Field                   | Description                                                               | Sample Data
    xref_AppType.appType          | Application Types                                                         | acme.erp, acme.portal, acme.assetmanagement
    xref_AppInstance.appInstance  | Application Instances (each will have a corresponding application type)  | acme.dynamics.ax, acme.dynamics.crm, acme.sharepoint, acme.maximo
    xref_IDXRef.idXRef            | Holds the cross reference data types                                      | acme.taxcode, acme.country
    xref_IDXRefData.CommonID      | Holds each cross reference type value used by the canonical schemas       | acme.vatcode.exmpt, acme.vatcode.std, acme.country.usa, acme.country.uk
    xref_IDXRefData.AppID         | Holds the value for each application instance and each xref type         | GBP, USD

    SQL Scripts

    The data to be stored in the BizTalkMgmtDb xref tables will be managed by SQL scripts stored in a database project in the Visual Studio solution.

    File(s)                                  | Description
    Build.cmd                                | A sqlcmd script to deploy data by running the SQL scripts below. (This can be run as part of the MSBuild process.)
    acme.purgexref.sql                       | SQL script to clear acme.* data from the xref tables. As such, this will not impact data for any other company.
    acme.applicationInstances.sql            | SQL script to insert application type and application instance data.
    acme.vatcode.sql, acme.country.sql, etc. | There will be a separate SQL script to insert each cross-reference data type and the application-specific values for these types.
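
    The excerpt names acme.purgexref.sql but does not show its contents. As a purely illustrative sketch of what such a namespace-scoped purge could look like, using only the table and column names listed above (the real BizTalkMgmtDb schema may have additional keys and constraints, so treat this as an outline rather than the article's actual script):

        -- Sketch: delete only the acme.* namespaced rows, leaving other companies' xref data untouched.
        DELETE FROM xref_IDXRefData  WHERE CommonID    LIKE 'acme.%';
        DELETE FROM xref_IDXRef      WHERE idXRef      LIKE 'acme.%';
        DELETE FROM xref_AppInstance WHERE appInstance LIKE 'acme.%';
        DELETE FROM xref_AppType     WHERE appType     LIKE 'acme.%';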

    Read the article

  • Build Explorer version 1.1 for Visual Studio Team Explorer is released

    - by terje
    Our free extension to Visual Studio, the folder-based Build Explorer version 1.1, has now been released and uploaded to the Visual Studio Gallery and CodePlex. We have collected up a few changes and some bug fixes, as follows:

    Changes:
    - Queue Default Builds can now optionally be fully enabled, fully disabled, or enabled just for leaf nodes (i.e. disabled for folders). If you have a large number of builds, it was pretty scary to be able to launch all of them with just one click. However, it is nice to avoid having the dialog box come up when you just want to run off a single build. That's the reasoning behind the third choice here.
    - Auto fill-in of the builds at startup and refresh. This was a request that came up a lot, and it was also irritating to us. When the Team Project is opened, the Build Explorer will start by itself and fill up its tree, so you don't need to click the node anymore. There was also quite a bit of flashing when the tree filled up; this has been reduced to just a single top-level fill before it collapses the node. The speed of the build-up of the tree has also been increased.
    - The "All Build Definitions" node is now shown on top of the list.
    - A login box appeared in certain cross-domain situations. This was a fix for the TF30063 authentication problem we had in the beginning. Hopefully the new code has that fixed properly, so that both the login box and the TF30063 are gone forever. Our testing so far seems to indicate it works. If anyone gets a real problem here, there are two workarounds: 1) turn off the auto refresh to reduce the issue; if this doesn't fix it, then 2) please reinstall the former version (go to the CodePlex download site if you don't have it anymore), write a comment to this blog post with a description of what happens, and I will send a temporary fix asap.

    Bug fixes:
    - The folder name matching was case sensitive, so "Application.CI" and "application.CI" created two different folders.
    - "View All Builds" was not shown for leaf nodes, and "View Builds" didn't work in all cases. There were some inconsistencies here which have been fixed.
    - Partly fixed: the context menu to queue a new build for disabled builds should be removed. That was a difficult one and is still on the list, but the command will not do anything for a disabled build.
    - Using Queue Default Builds on a folder that had some disabled builds below it made an error box appear and ruined the whole experience.

    As a result of these fixes, some new options have been introduced, as shown below:

    The first two settings, the separator symbol and the options for how to handle queuing of default builds, are set per Team Project and are stored in TFS source control under the BuildProcessTemplates folder, with the name Inmeta.VisualStudio.BuildExplorer.Settings.xml.

    The next two settings need some explanation. They handle the behavior of the auto update of the build folders. First, these are stored in the local registry per user, at the key HKEY_CURRENT_USER\Software\Inmeta\BuildExplorer. The first option, Use Timed Refresh at Startup, if turned off, means you will need to click the node as in version 1.0. The second option is a timed value: the time after the Build Explorer node is created until the scanning of the build folders starts. It is assumed that this is enough, and the tests so far indicate this. If you have very many builds and you see that the explorer doesn't get them all, try to increase this value and, of course, notify me of your case, either here or on the Visual Studio Gallery site.

    Read the article

  • Romanian partner Omnilogic Delivers “No Limits” Scalability, Performance, Security, and Affordability through Next-Generation, Enterprise-Grade Engineered Systems

    - by swalker
    Omnilogic SRL is a leading technology and information systems provider in Romania and central and Eastern Europe. An Oracle Value-Added Distributor Partner, Omnilogic resells Oracle software, hardware, and engineered systems to Oracle Partner Network members and provides specialized training, support, and testing facilities. Independent software vendors (ISVs) also use Omnilogic's demonstration and testing facilities to upgrade the performance and efficiency of their solutions and those of their customers by migrating them from competitor technologies to Oracle platforms. Omnilogic also has a dedicated offering for ISV solutions, based on Oracle technology in a hosting service provider model.

    Omnilogic wanted to help Oracle Partners and ISVs migrate solutions to Oracle Exadata and sell Oracle Exadata to end-customers. It installed Oracle Exadata Database Machine X2-2 Quarter Rack at its data center to create a demonstration and testing environment. Demonstrations proved that Oracle Exadata achieved processing speeds up to 100 times faster than competitor systems, cut typical back-up times from 6 hours to 20 minutes, and stored 10 times more data. Oracle Partners and ISVs learned that migrating solutions to Oracle Exadata's preconfigured, pre-integrated hardware and software can be completed rapidly, at low cost, without business disruption, and with reduced ongoing operating costs.

    Challenges

    A word from Omnilogic: "Oracle Exadata is the new killer application: the smartest solution on the market. There is no competition." (Sorin Dragomir, Chief Operating Officer, Omnilogic SRL)

    - Enable Oracle Partners in Romania and central and eastern Europe to achieve Oracle Exadata Ready status by providing facilities to test and optimize existing applications and build real-life proofs of concept (POCs) for new solutions on Oracle Exadata Database Machine
    - Provide technical support and demonstration facilities for ISVs migrating their customers' solutions from competitor technologies to Oracle Exadata to maximize performance, scalability, and security; optimize hardware and datacenter space; cut maintenance costs; and improve return on investment
    - Demonstrate the power of Oracle Exadata's high-performance, high-capacity engineered systems for customer-facing businesses, such as government organizations, telecommunications, banking and insurance, and utility companies, which typically require continuous availability to support very large data volumes
    - Showcase Oracle Exadata's unchallenged online transaction processing (OLTP) capabilities that cut application run times to provide unrivalled query turnaround and user response speeds while significantly reducing back-up times and eliminating the risk of unplanned outages
    - Capitalize on providing a world-class training and demonstration environment for Oracle Exadata to accelerate sales with Oracle Partners

    Solutions

    - Created a testing environment to enable Oracle Partners and ISVs to test their own solutions and those of their customers on Oracle Exadata running on Oracle Enterprise Linux or Oracle Solaris Express to benchmark performance prior to migration
    - Leveraged expertise on Oracle Exadata to offer Oracle Exadata training, migration, and support seminars, and to showcase live demonstrations for Oracle Partners
    - Proved how Oracle Exadata's pre-engineered systems, which come assembled, configured, and ready to run, reduce deployment time and cost, minimize risk, and help customers achieve the full performance potential immediately after go-live
    - Increased processing speeds 10-fold, with zero data loss, for a telecommunications provider's client-facing customer relationship management solution
    - Achieved performance improvements of between 6 and 100 times for financial and utility company applications currently running on IBM, Microsoft, or SAP HANA platforms
    - Showed how daily closure procedures carried out overnight by banks, insurance companies, and other financial institutions to analyze each day's business can typically be cut from around six hours to 20 minutes, some 18 times faster, when running on Oracle Exadata
    - Simulated concurrent back-ups while running applications under normal working conditions to prove that Oracle Exadata-based solutions can be backed up during business hours without causing bottlenecks or impacting the end-user experience
    - Demonstrated that Oracle Exadata's built-in analytics, data mining and OLTP capabilities make it the highest-performance, lowest-cost choice for large data warehousing operations
    - Showed how Oracle Exadata's columnar compression and intelligent storage architecture allows 10 times more data to be stored than on competitor platforms
    - Demonstrated how Oracle Exadata cuts hardware requirements significantly by consolidating workloads onto fewer servers, which delivers greater power efficiency and lower operating costs than competing systems from IBM and other manufacturers
    - Proved to ISVs that migrating solutions to Oracle Exadata's preconfigured, pre-integrated hardware and software can be completed rapidly, at low cost, and with minimal business disruption
    - Demonstrated how storage servers, database servers, and network switches can be added incrementally and inexpensively to the Oracle Exadata platform to support business expansion
    - On track to grow revenues by 10% in year one and by 15% annually thereafter through increased business generated from Oracle Partners and ISVs

    Read the article

  • Using the ASP.NET Membership API with SQL Server / SQL Azure: The new “System.Web.Providers” namespace

    - by Harish Ranganathan
    The Membership API came in .NET 2.0 and was a huge enhancement for building web applications with users, roles, permissions, etc. By default the Membership API uses SQL Express, and until Visual Studio 2008 it was available only through the ASP.NET Configuration manager screen (Website – ASP.NET Configuration, or Project – ASP.NET Configuration); for every application you had to manually visit this screen to start using the security and other settings. Upon doing that, the default SQL Express database aspnet.mdf is created to store all the user profiles.

    Starting with Visual Studio 2010 and .NET 4.0, the default website template includes the Membership API controls as part of the page, i.e. when you create a "File – New – ASP.NET Web Application" or an "ASP.NET MVC Application", the Login/Register controls are enabled in the MasterPage by default, and they are wired to the "ApplicationServices" setting in the web.config file, with a connection string pointing to the SQL Express database. In fact, when you run the default website, click "LogOn" –> "Register", enter the registration details, and click "Register", that is the moment the aspnet.mdf file is created with the tables for Users, Roles, UsersInRoles, Profile, etc. This uses the default SQL Express database within the App_Data folder. If you want to move your Membership information to some other database, such as SQL Server, SQL CE or SQL Azure, you need to manually run the aspnet_regsql command and specify the destination database name. This creates all the tables, procedures and views required to handle the Membership information. Thereafter you can change the connection string for "ApplicationServices" to point to the database where you ran all the scripts.

    Now, enter "System.Web.Providers" Alpha. This is available as part of the NuGet package library. Scott Hanselman has a neat post describing the steps required to get it up and running, as well as the basic changes, at http://www.hanselman.com/blog/IntroducingSystemWebProvidersASPNETUniversalProvidersForSessionMembershipRolesAndUserProfileOnSQLCompactAndSQLAzure.aspx. Pretty much, it covers what the new System.Web.Providers do. One thing I wanted to clarify is that the new "System.Web.Providers" add a lot of new settings, which are also marked as the defaults, in the web.config. Even now, they use SQL Express as the default database. But if you change the connection string for "DefaultConnection" under connectionStrings to point to your SQL Server or SQL Azure, the Membership API is now able to create all the tables, procedures and views at the destination specified (i.e. SQL Server or SQL Azure).

    In my case, I modified the DefaultConnection to point to my SQL Azure database. Next, I hit F5 to run the application. The default view loads. I clicked on "LogOn" and then "Register", since I knew there were no tables/users at that point. One thing to note is that I had put "NewDB" as the database name in the connection string that points to SQL Azure. NewDB didn't exist yet, and I assumed it would be created before the tables/views/procedures for Membership are created. Once I clicked "Register" to register my first username, it took a while and then registered me as well as logged me in.

    Also, I went to the SQL Azure Management Portal and verified that "NewDB" had just been created. I could also connect to the SQL Azure database "NewDB" from Management Studio and found that the tables now don't have the aspnet_ prefix. The tables were simply Users, Roles, UsersInRoles, Profiles, etc. So, with a few clicks and a configuration change, I could actually set up the user base for my application on SQL Azure, and even have the session state, roles and profiles stored in the SQL Azure database.

    The new System.Web.Providers also require the MARS setting (MultipleActiveResultSets=true), since they use Entity Framework for the DAL operations. Also, the "Project – ASP.NET Configuration" screen can still be used to create and manage users/roles etc., even though the data is stored in the remote database. With that, a long-pending request from the community, the ability to configure and use remote databases for application user management without having to run the scripts from SQL Express, is fulfilled. Cheers !!!
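
    To confirm the same thing from a query window rather than Object Explorer, a quick check along these lines works against the new database (the table list below is just the set of names called out in the post):

        -- Membership tables created by System.Web.Providers carry no aspnet_ prefix.
        SELECT name
        FROM sys.tables
        WHERE name IN ('Users', 'Roles', 'UsersInRoles', 'Profiles')
        ORDER BY name;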

    Read the article

  • Delegation of Solaris Zone Administration

    - by darrenm
    In Solaris 11, 'Zone Delegation' is a built-in feature. The Zones system now uses fine-grained RBAC authorisations to allow delegation of management of distinct zones, rather than all zones, which is what the 'Zone Management' RBAC profile did in Solaris 10. The data for this can be stored with the zone, or you could also create RBAC profiles (that can even be stored in NIS or LDAP) for granting access to specific lists of zones to administrators.

    For example, let's say we have zones named zoneA through zoneF and we have three admins: alice, bob, and carl. We want to grant a subset of the zone management to each of them. We could do that either by adding the admin resource to the appropriate zones via zonecfg(1M), or we could do something like this with the RBAC data directly. First, let's look at an example of storing the data with the zone:

    # zonecfg -z zoneA
    zonecfg:zoneA> add admin
    zonecfg:zoneA> set user=alice
    zonecfg:zoneA> set auths=manage
    zonecfg:zoneA> end
    zonecfg:zoneA> commit
    zonecfg:zoneA> exit

    Now let's look at the alternate method of storing this directly in the RBAC database, but we will show all our admins and zones for this example:

    # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneA alice
    # usermod -A +solaris.zone.login/zoneB alice
    # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneB bob
    # usermod -A +solaris.zone.manage/zoneC bob
    # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneC carl
    # usermod -A +solaris.zone.manage/zoneD carl
    # usermod -A +solaris.zone.manage/zoneE carl
    # usermod -A +solaris.zone.manage/zoneF carl

    In the above, alice can only manage zoneA, bob can manage zoneB and zoneC, and carl can manage zoneC through zoneF. The user alice can also log in on the console to zoneB, but she can't do the operations that require the solaris.zone.manage authorisation on it.

    Or, if you have a large number of zones and/or admins, or you just want to provide a layer of abstraction, you can collect the authorisation lists into an RBAC profile and grant that to the admins. For example, let's create an RBAC profile for the things that alice and carl can do:

    # profiles -p 'Zone Group 1'
    profiles:Zone Group 1> set desc="Zone Group 1"
    profiles:Zone Group 1> add profile="Zone Management"
    profiles:Zone Group 1> add auths=solaris.zone.manage/zoneA
    profiles:Zone Group 1> add auths=solaris.zone.login/zoneB
    profiles:Zone Group 1> commit
    profiles:Zone Group 1> exit

    # profiles -p 'Zone Group 3'
    profiles:Zone Group 3> set desc="Zone Group 3"
    profiles:Zone Group 3> add profile="Zone Management"
    profiles:Zone Group 3> add auths=solaris.zone.manage/zoneD
    profiles:Zone Group 3> add auths=solaris.zone.manage/zoneE
    profiles:Zone Group 3> add auths=solaris.zone.manage/zoneF
    profiles:Zone Group 3> commit
    profiles:Zone Group 3> exit

    Now, instead of granting carl and alice the 'Zone Management' profile and the authorisations directly, we can just give them the appropriate profile:

    # usermod -P +'Zone Group 3' carl
    # usermod -P +'Zone Group 1' alice

    If we wanted to store the profile data and the profiles granted to the users in LDAP, we would just add '-S ldap' to the profiles and usermod commands. For a documentation overview, see the description of the "admin" resource in zonecfg(1M), profiles(1) and usermod(1M).

    Read the article
