Search Results

Search found 28693 results on 1148 pages for 'oracle advanced security'.


  • HTML/JavaScript compaction for security

    - by BCS
    I just ran across a post that describes a security vulnerability in web apps: by looking at the sizes of encrypted web pages, an eavesdropper can deduce what the user is doing. The simplest mitigation I can think of would be to use a tool to minify all static content so that (after encryption) only a small number of response sizes exist, minimizing the information available to an eavesdropper. Are there any tools for doing this?

    Read the article

  • Security Suggestions

    - by Kumar
    I am currently working on an ASP.NET 3.5 / C# web application that deals with users' sensitive information, such as credit card numbers. What security measures do I need to take from an application-development standpoint so that I can sleep peacefully at night? :)
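
    For one concrete measure (a minimal sketch, assuming card data must be stored at all; PCI guidance generally says to avoid storing it), you could encrypt the value at rest with .NET's built-in AES support. The class and method names below are real .NET 3.5 APIs; the key handling is a placeholder.

        using System;
        using System.IO;
        using System.Security.Cryptography;
        using System.Text;

        public static class CardProtector
        {
            // Sketch: encrypt a card number with AES before persisting it.
            // In production the key must come from a secure store (DPAPI,
            // a key vault, an HSM), never from source code or plain config.
            public static byte[] Encrypt(string cardNumber, byte[] key)
            {
                using (var aes = Aes.Create())
                {
                    aes.Key = key;
                    aes.GenerateIV();
                    using (var ms = new MemoryStream())
                    {
                        // Prepend the IV so decryption can recover it later.
                        ms.Write(aes.IV, 0, aes.IV.Length);
                        using (var cs = new CryptoStream(ms,
                            aes.CreateEncryptor(), CryptoStreamMode.Write))
                        {
                            byte[] plain = Encoding.UTF8.GetBytes(cardNumber);
                            cs.Write(plain, 0, plain.Length);
                        }
                        return ms.ToArray();
                    }
                }
            }
        }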

    Read the article

  • Spring Security User

    - by DD
    What is best practice in Spring when creating a new user with custom attributes: extend org.springframework.security.core.userdetails.User, or create the User inside the UserDetailsService (the approach taken in the IceFaces tutorial)?

        public UserDetails loadUserByUsername(String username)
                throws UsernameNotFoundException, DataAccessException {
            AppUser user = userDAO.findUser(username);
            if (user == null) {
                throw new UsernameNotFoundException("User not found: " + username);
            }
            return makeUser(user);
        }

        private User makeUser(AppUser user) {
            return new User(user.getLogin(), user.getPassword(),
                    true, true, true, true, makeGrantedAuthorities(user));
        }

    Read the article

  • Using web.config directory security and extensionless URLs

    - by Matt Brailsford
    Hi Guys, I'd like to use the directory security features built into web.config to restrict access to child pages of a parent page. My structure is as follows: Members, Members/News, Members/Press, Members/Movies. Users should have access to the Members parent page, but not to its child pages. My problem is that because I am using extensionless URLs, web.config treats each page as a directory, so access to the parent is blocked as well. Is there a way to restrict access only for the sub-pages?
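
    One approach that may work (a sketch only; the paths follow the structure above, and the exact allow/deny rules depend on your membership setup) is to leave the parent path open and add a <location> element per child path:

        <configuration>
          <!-- Parent page stays reachable by everyone. -->
          <location path="Members">
            <system.web>
              <authorization>
                <allow users="*" />
              </authorization>
            </system.web>
          </location>
          <!-- Each child path gets its own rule: deny anonymous users. -->
          <location path="Members/News">
            <system.web>
              <authorization>
                <deny users="?" />
              </authorization>
            </system.web>
          </location>
          <!-- Repeat for Members/Press and Members/Movies. -->
        </configuration>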

    Read the article

  • Javascript reference external script file - security implications

    - by rkrauter
    Hi, if I reference an external third-party JavaScript file on my website, what are the security implications? Can the JavaScript file be used to steal cookies? One example of this is the Google Analytics JavaScript reference. Could the third party technically steal cookies or other sensitive information from my logged-on users (XSS)? The whole cross-domain scripting topic has me confused sometimes. Thanks!

    Read the article

  • Security: when an application goes to the background, does iOS store an image of the current screen?

    - by user1509593
    Is this a probable security flaw? A user in public (let's say in a Starbucks) tries to log in to an iOS application. He enters his user id and password [the password is hidden with xxxxxxxx (not exposed)], and a call comes in, or he presses Home, and the application goes to the background. a) Does iOS store an image of the current screen? b) If a malicious hacker takes control of the device, can he read the password? Do we have to clear out sensitive information when going to the background?

    Read the article

  • File System and security (PHP)

    - by Felicita
    Consider a simple file upload system written in PHP. The customer has access only to an admin panel (not FTP). He may need to change a folder's permissions from 707 to 755 for security reasons. How can he do this? Can we do it from the upload script? If yes, is the application still secure?

    Read the article

  • Thread pool stack security issue

    - by elmatador
    In a naive implementation of a thread pool, can a piece of code that is being executed read the data left by some previous code on the stack (if it was running on the same thread instance)? Also, are there any other inherent security issues connected to thread pools?
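
    For what it's worth, a related effect is easy to demonstrate in managed code (a sketch, not tied to any particular pool implementation): thread-local state written by one work item survives into the next work item that happens to run on the same reused pool thread.

        using System;
        using System.Threading;

        class ThreadPoolLeak
        {
            // One slot per pool thread, not per work item, so a value set
            // by task A can still be visible to task B on the same thread.
            [ThreadStatic] static string leftover;

            static void Main()
            {
                ThreadPool.QueueUserWorkItem(_ => leftover = "secret from task A");
                Thread.Sleep(100); // crude: let task A finish so its thread is free
                ThreadPool.QueueUserWorkItem(_ =>
                    Console.WriteLine("task B sees: " + (leftover ?? "nothing")));
                Thread.Sleep(100); // may print "nothing" if B gets a different thread
            }
        }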

    Read the article

  • Defining security resources: static or dynamic?

    - by mmontalvo
    I am implementing a simple (hopefully) security manager within an application. Is it better to have predefined (static) roles or custom (dynamic) roles? I am leaning towards dynamic groups or roles, because they would not require a redeploy to update the system. Also, what would be the best approach to defining resources in general? The application has a database that can hold either the static or the dynamic values. (A sketch of the dynamic approach follows.)
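
    For reference, a minimal sketch of the dynamic approach in C# (every type, property, and key name here is hypothetical, purely to illustrate roles and resources stored as data rather than compiled in):

        using System.Collections.Generic;
        using System.Linq;

        // Hypothetical: roles and their resource grants are loaded from the
        // database, so adding a role is a data change, not a redeploy.
        public class Role
        {
            public string Name { get; set; }                  // e.g. "Auditor"
            public HashSet<string> ResourceKeys { get; set; } // e.g. "reports.view"
        }

        public class SecurityManager
        {
            private readonly Dictionary<string, Role> rolesByName;

            public SecurityManager(IEnumerable<Role> rolesFromDb)
            {
                rolesByName = rolesFromDb.ToDictionary(r => r.Name);
            }

            // True if any of the user's roles grants the resource key.
            public bool IsAllowed(IEnumerable<string> userRoles, string resourceKey)
            {
                return userRoles.Any(name =>
                    rolesByName.ContainsKey(name) &&
                    rolesByName[name].ResourceKeys.Contains(resourceKey));
            }
        }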

    Read the article

  • spring-security and jsf

    - by Mike
    Hi! I am developing a Spring Security application in JSF. The login form works fine. However, when I later try to retrieve the authentication object, the principal is always anonymous. I fetch it like this:

        Authentication auth = SecurityContextHolder.getContext().getAuthentication();

    Read the article

  • What is New in ASP.NET 4.0 Code Access Security

    - by HosamKamel
    ASP.NET Code Access Security (CAS) is a feature that helps protect server applications on hosts running multiple Web sites. ASP.NET lets you assign a configurable trust level that corresponds to a predefined set of permissions. ASP.NET ships with predefined trust levels and policy files that you can assign to applications, and you can also assign custom trust levels and policy files. Most web hosting companies run ASP.NET applications in Medium Trust to prevent one website from affecting or harming another. As the .NET Framework's Code Access Security model has evolved, ASP.NET 4.0 Code Access Security has also introduced several changes and improvements. A full post addressing the new changes in ASP.NET 4.0 is published by the ASP.NET QA Team here: http://weblogs.asp.net/asptest/archive/2010/04/23/what-is-new-in-asp-net-4-0-code-access-security.aspx
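
    For context, a trust level is assigned in configuration. A minimal sketch using the standard ASP.NET <trust> element (the level name is one of the built-in ones the post refers to):

        <!-- In web.config: run the application under the predefined
             Medium trust level. Hosts often lock this setting in the
             machine-level configuration so individual sites cannot raise it. -->
        <configuration>
          <system.web>
            <trust level="Medium" />
          </system.web>
        </configuration>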

    Read the article

  • Advanced Data Source Engine coming to Telerik Reporting Q1 2010

    This is the final blog post from the pre-release series. In it we are going to share with you some of the updates coming to our reporting solution in Q1 2010. A new declarative Data Source Engine will be added to Telerik Reporting that will allow full control over data management and deliver significant gains in rendering performance and memory consumption. Some of the engine's new features will be:

    - Data source parameters: these will be used to limit the data retrieved from the data source to just the data needed for the report. Data source parameters are processed on the data source side, and only the queried data is fetched to the reporting engine, rather than the full data source. This leads to lower memory consumption, because data operations are performed on the queried data only, and only that data needs to be stored in memory, rather than the whole dataset, as was the case with the old approach. (A generic sketch of this idea appears at the end of this post.)

    - Support for stored procedures: they will assist in achieving a consistent implementation of logic across applications, and are especially practical for performing repetitive tasks. A stored procedure stores the SQL statements and logic, which can then be executed in different reports and/or applications. Stored procedures will not only save development time, but will also improve performance, because each stored procedure is compiled on the database server once and then reused. In Telerik Reporting, stored procedures will also be parameterized, with elements of the SQL statement bound to parameters. These parameterized SQL queries are handled through the data source parameters and evaluated at run time. Using parameterized SQL queries will improve performance and decrease the memory footprint of your application, because they are applied directly on the database server and only the necessary data is downloaded to the middle tier or client machine.

    - Calculated fields through expressions: with the help of the new reporting engine you will be able to use field values in formulas to come up with a calculated field. A calculated field is a user-defined field that is computed "on the fly" and does not exist in the data source, but can perform calculations using the data of the data source object it belongs to. Calculated fields are very handy for adding frequently used formulas to your reports.

    - Improved performance and an optimized in-memory OLAP engine: the new data source will come with several improvements in how aggregates are calculated and memory is managed. As a result, you may see between a 30% (for simpler reports) and a 400% (for calculation-intensive reports) improvement in rendering performance, and about a 50% decrease in memory consumption.

    - Full design-time support through wizards: declarative data sources are a great advance and will save developers countless hours of coding. In Q1 2010, and true to Telerik Reporting's essence, using the new data source engine and its features requires little to no coding, because we have extended most of the wizards to support the new functionality. The newly extended wizards are available at design time in VS2005/VS2008/VS2010.

    More features will be revealed on the product's what's new page when the new version is officially released in a few days. Also make sure you attend the free webinar on Thursday, March 11th, which will be dedicated to the updates in Telerik Reporting Q1 2010.
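
    To illustrate the principle behind data source parameters, here is a generic ADO.NET sketch (this is not Telerik's API; the connection string, table, and parameter are made up for illustration) showing how a parameterized query filters on the database server so only the needed rows reach the reporting tier:

        using System;
        using System.Data.SqlClient;

        class ParameterizedQueryDemo
        {
            static void Main()
            {
                // The WHERE clause runs on the server, so only matching rows
                // cross the wire; the same idea the new engine applies to
                // report data source parameters.
                using (var conn = new SqlConnection(
                    "Server=.;Database=Sales;Integrated Security=true"))
                using (var cmd = new SqlCommand(
                    "SELECT OrderId, Total FROM Orders WHERE Region = @region", conn))
                {
                    cmd.Parameters.AddWithValue("@region", "EMEA");
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine("{0}: {1}", reader[0], reader[1]);
                    }
                }
            }
        }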

    Read the article

  • SQL SERVER – Advanced Data Quality Services with Melissa Data – Azure Data Market

    - by pinaldave
    There has been much fanfare over the new SQL Server 2012, and especially around its new companion product, Data Quality Services (DQS). Among the many new features is this integrated, knowledge-driven product that enables data stewards everywhere to profile, match, and cleanse data. In addition to the homegrown rules that data stewards can design and implement, there are also connectors to third-party providers hosted in the Azure Datamarket marketplace. In this review, I use SQL Server 2012 Data Quality Services and subscribe to a third-party data cleansing product through the Datamarket to showcase this unique capability.

    Crucial Questions

    For the purposes of the review, I used a database I had in an Excel spreadsheet with name and address information. Upon a cursory inspection, there are miscellaneous problems with these records: some addresses are missing ZIP codes, others are missing a city, and some records are slightly misspelled or have unparsed suites. With DQS, I can easily add a knowledge base to help standardize my values, such as state abbreviations. But how do I know that my address is correct? And if my address is not correct, what should it be corrected to? The answer lies in a third-party knowledge base from the acknowledged USPS-certified address accuracy experts at Melissa Data.

    Reference Data Services

    Within DQS there is a handy feature to add reference data from many different third-party Reference Data Services (RDS) vendors. DQS simplifies the processes of cleansing, standardizing, and enriching data through custom rules and through service providers from the Azure Datamarket. A quick jump over to the Datamarket site shows that there are a handful of providers that offer data directly through Data Quality Services. Upon subscribing to these services, one can attach a DQS domain or composite domain (fields in a record) to a reference data service provider and begin using it to cleanse, standardize, and enrich that data. Besides what I am looking for (address correction and enrichment), it is possible to subscribe to a host of other services, including geocoding, IP address reference, phone checking and enrichment, as well as name parsing, standardization, and genderization. These capabilities extend DQS's native data quality features by quite a bit.

    For my address correction review, I first needed to sign up with a reference data provider on the Azure Data Market site. For this example, I used Melissa Data's Address Check service. They offer free one-month trials, so if you wish to follow along, or need to add address quality to your own data, I encourage you to sign up with them. Once I subscribed to the desired reference data provider, I navigated my browser to Account Keys within My Account to view the generated account key, which I then inserted into the DQS Client – Configuration under the Administration area.

    Step by Step Guide

    That was all it took to hook the subscribed provider, Melissa Data, directly into my DQS Client. The next step was to attach and map the reference data from the newly acquired provider to a domain in my knowledge base. On the DQS Client home screen, I selected "New Knowledge Base" under Knowledge Base Management on the left-hand side. Under New Knowledge Base, I typed a name and description for my new knowledge base, then proceeded to the Domain Management screen.

    Here I established a series of domains (fields) and then linked them together as a composite domain (record set). Using the Create Domain button, I created the following domains according to the fields in my incoming data: Name, Address, Suite, City, State, Zip. I added a Suite column in my domain because Melissa Data has the ability to return missing suites based on last name or company. That is a great benefit of using these third-party providers: they have data that the data steward would not normally have access to. The bottom line is, with these third-party data providers, I can actually improve my data.

    Next, I created a composite domain (fulladdress) and added the (field) domains into the composite domain. This essentially groups our address fields together in a record to facilitate the full address cleansing they perform. I then selected my newly created composite domain and, under the Reference Data tab, added my third-party reference data provider, Melissa Data's Address Check, and mapped each of my domains to the provider's schema. Now that my composite domain has been married to the reference data service, I can take the newly published knowledge base and create a project to cleanse and enrich my data.

    My next task was to create a new Data Quality project, mapping in my data source and matching it to the appropriate domain column, and then kick off the verification process. It took just a few minutes, with progress indicators showing that it was working. When the process concluded, a helpful set of tabs placed the response records into categories: suggested; new; invalid; corrected (automatically); and correct. Accepting the suggestions provided by Melissa Data allowed me to clean up all the records and flag the invalid ones. It is very apparent that DQS makes address data quality simple for any IT professional.

    Final Note

    As I have shown, DQS makes data quality very easy. Within minutes I was able to set up a data cleansing and enrichment routine within my data quality project and ensure that my address data was clean, verified, and standardized against real reference data. As reviewed here, it's easy to see how SQL Server 2012 and DQS together take what used to require a highly skilled developer and empower an average business or database person to consume external services and clean data.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article
