Search Results

Search found 4140 results on 166 pages for 'alias analysis'.

Page 48/166

  • ~/.profile does not run on startup

    - by pocoa
    I want to run some scripts at system startup, so I've added the following to my ~/.profile: WORKSPACE="~/Development/workspace" and alias workspace="cd $WORKSPACE". I want this "workspace" alias to be available after startup, but it isn't. Maybe it's not the right place to define these variables?
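
    What the asker seems to want, as a minimal sketch (bash assumed). Two details commonly bite here: "~" does not expand inside double quotes (use $HOME), and ~/.profile is read only by login shells, so aliases usually live in an interactive-shell file such as ~/.bashrc:

        # ~/.bashrc -- read by interactive shells; ~/.profile is login shells only
        WORKSPACE="$HOME/Development/workspace"   # $HOME expands; a quoted "~" would not
        alias workspace="cd $WORKSPACE"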

    Read the article

  • MS project publishing to TFS web portal display

    - by denis bastarache
    When we initially created our MPP schedule, I used indents / subordinate tasks to break down the project by the various stages of the lifecycle, which is fine... no issues there. But now that I'm trying to publish this to the TFS display, it only picks up the actual "action items / sub-tasks", since that's where I have resource allocation specified. For example, I have an "Analysis" phase with a few items underneath it, and a "System Requirements" phase with the same items. When I publish these to TFS, it doesn't display the "Parent" distinction between items, so both "Tasks" instances end up published in TFS under the exact same name. If I can't do this automatically, I'll likely have to edit each task by hand as "Analysis - Item 1", "Analysis - Item 2", "SRD - Item 1", "SRD - Item 2"... Is there a way to do this automatically, or will I have to go the manual route?

    Read the article

  • Configure bash_profile for one single terminal emulator

    - by Hugo
    I'm using a new terminal emulator. Terminology is the E17 default terminal, and it has a great command, $ tyls, which is a "graphical" $ ls. I want to create an alias just for this terminal, because the "tyls" command makes no sense to konsole, rxvt or other terminals. I'm thinking of some kind of "if" in ~/.bash_profile that tests whether I'm in Terminology and, if so, runs the following command: alias ls="tyls". But how can I test whether I'm in Terminology and not xterm? Can someone help me? Thanks!
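
    A sketch of one way to write that test, assuming Terminology exports a TERMINOLOGY environment variable set to 1 (current releases do; if yours does not, inspecting $TERM is the fallback):

        # ~/.bash_profile -- sketch; assumes Terminology sets TERMINOLOGY=1
        if [ "$TERMINOLOGY" = "1" ]; then
            alias ls="tyls"    # graphical listing only where tyls exists
        fi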

    Read the article

  • Common filesystem for servers behind a rackspace load balancer

    - by thanos panousis
    Our PHP application consists of a single web server that receives files from clients and performs a CPU-intensive analysis on them. Right now, analysis of a single user upload takes about 3 seconds to complete at 100% CPU, which caps our capacity at roughly 1/3 of a request per second. My team's requirement is to increase capacity without a lot of code reengineering. A possible solution would be to set up a load balancer in front of multiple servers running the same app, connecting to a common DB. The problem is that the analysis writes its output files to local disk. A load balancer would increase capacity, but then the files wouldn't be shared between servers, so subsequent client requests might fail. We are hosted on Rackspace; is there a way to configure some sort of "common" storage for all servers without having to rewrite our file persistence code? The current code relies on simple fopens etc. What are our options?
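
    The usual low-change answer is a network filesystem mounted at the same path on every web node, so existing fopen() calls keep working unmodified. A sketch of such a mount, with the server name and paths purely hypothetical (whether Rackspace offers a suitable NFS/SAN export is exactly the asker's question):

        # /etc/fstab on each web node -- sketch; assumes a shared NFS export exists
        nfs-server:/export/analysis  /var/www/app/output  nfs  rw,hard  0 0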

    Read the article

  • Serving protected files using Nginx's X-Accel-Redirect header

    - by andybak
    I'm trying to serve protected files using this directive in my nginx.conf:

        location /secure/ {
            internal;
            alias /home/ldr/webapps/nginx/app/secure/;
        }

    I'm passing in paths of the form "/myfile.doc", so the file's path would be /home/ldr/webapps/nginx/app/secure/myfile.doc. I just get 404s when I access http://myserver/secure/myfile.doc. I've tried taking the trailing / off the location directive and that makes no difference. Two questions: 1. How do I fix it? 2. How can I debug problems like this myself? How can I get Nginx to report which path it's looking for? error.log shows nothing, and access.log just tells me which URL is being requested, which is the bit I already know! It's no fun trying things randomly without any feedback. Here's my entire nginx.conf:

        daemon off;
        worker_processes 2;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            default_type application/octet-stream;

            server {
                listen 21534;
                server_name my.server.com;
                client_max_body_size 5m;

                location /media/ {
                    alias /home/ldr/webapps/nginx/app/media/;
                }

                location / {
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    fastcgi_pass unix:/home/ldr/webapps/nginx/app/myproject/django.sock;
                    fastcgi_pass_header Authorization;
                    fastcgi_hide_header X-Accel-Redirect;
                    fastcgi_hide_header X-Sendfile;
                    fastcgi_intercept_errors off;
                    include fastcgi_params;
                }

                location /secure {
                    internal;
                    alias /home/ldr/webapps/nginx/app/secure/;
                }
            }
        }

    EDIT: I'm trying some of the suggestions here. So I've tried:

        location /secure/ {
            internal;
            alias /home/ldr/webapps/nginx/app/;
        }

    both with and without the trailing slash on location. I've also tried moving this block before the "location /" directive. The page I linked to has ^~ after 'location', giving:

        location ^~ /secure/ { ...etc... }

    Not sure what that signifies, but it didn't work either!
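
    On the second question, one broadly applicable technique, offered as a sketch (the log path is hypothetical; use whatever error_log already points at): raise the error_log level to debug, which makes nginx trace location matching and the exact filesystem path it tries to open for each request. This requires an nginx binary built with --with-debug:

        error_log /home/ldr/webapps/nginx/logs/error.log debug;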

    Read the article

  • SQL SERVER – Microsoft SQL Server Migration Assistant V6.0 Released

    - by Pinal Dave
    Every company makes a different decision about its database when it starts out, but as it moves forward it matures and makes decisions based on experience and the best interest of the organization. Many organizations start on a platform such as Sybase, MySQL, Oracle or Access, and as time passes they learn that they want to move to a different one. Microsoft makes this easy for SQL Server professionals by releasing various Migration Assistant tools. Last week, Microsoft released Microsoft SQL Server Migration Assistant v6.0. Here are the different editions released last week to migrate various products to SQL Server.

    Microsoft SQL Server Migration Assistant v6.0 for Sybase
    SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Sybase Adaptive Server Enterprise (ASE) to SQL Server and Azure SQL DB. SSMA automates all aspects of migration, including migration assessment analysis, schema and SQL statement conversion, data migration, and migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for MySQL
    SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from MySQL to SQL Server and Azure SQL DB. SSMA automates all aspects of migration, including migration assessment analysis, schema and SQL statement conversion, data migration, and migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for Oracle
    SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Oracle to SQL Server and Azure SQL DB. SSMA automates all aspects of migration, including migration assessment analysis, schema and SQL statement conversion, data migration, and migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for Access
    SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Access to SQL Server. SSMA for Access automates conversion of Microsoft Access database objects to SQL Server database objects, loads the objects into SQL Server and Azure SQL DB, and then migrates data from Microsoft Access to SQL Server and Azure SQL DB.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SQL Migration

    Read the article

  • Take Control of Workflow with Workflow Analyzer!

    - by user793553
    Take Control of Workflow with Workflow Analyzer! Immediate analysis and output of your EBS Workflow environment. The EBS Workflow Analyzer is a script that reviews the current Workflow footprint, analyzes the configuration and environment, and provides feedback and recommendations on best practices and areas of concern. Go to Doc ID 1369938.1 for more details, the script download, and a short overview video.

    Proactive benefits:
    - Immediate analysis and output of the Workflow environment
    - Identifies aged records
    - Identifies Workflow errors and volumes
    - Identifies looping Workflow items and stuck activities
    - Identifies Workflow system setup and configurations
    - Identifies and recommends Workflow best practices
    - Easy-to-add tool for regular Workflow maintenance
    - Execute the analysis anytime to compare trending against past outputs

    The Workflow Analyzer presents key details in an easy-to-review graphical manner. See the examples below.

    Workflow Runtime Data Table Gauge: the gauge shows critical (red), bad (yellow) and good (green) states depending on the number of workflow items (WF_ITEMS).

    Workflow Error Notifications Pie Chart: a pie chart shows the workflow error notification types.

    Workflow Runtime Table Footprint Bar Chart: a bar chart shows the workflow runtime table footprint.

    The analyzer also gives detailed listings of setups and configurations; as an example, the workflow services are listed along with their status for review. The analyzer draws attention to key details with yellow and red boxes highlighting areas for review. You can extend any query by reviewing the SQL script and then running it on your own, or modifying it for your own needs.

    Find more details in these notes:
    Doc ID 1369938.1 - Workflow Analyzer script for E-Business Suite Workflow Monitoring and Maintenance
    Doc ID 1425053.1 - How to run EBS Workflow Analyzer Tool as a Concurrent Request
    Or visit the My Oracle Support EBS - Core Workflow Community

    Read the article

  • Linux to Solaris @ Morgan Stanley

    - by mgerdts
    I came across this blog entry and the accompanying presentation by Robert Milkoski about his experience switching from Linux to Oracle Solaris 11 for a distributed OpenAFS file serving environment at Morgan Stanley.

    If you are an IT manager, the presentation will show you:
    - Running Solaris with a support contract can cost less than running Linux (even without a support contract) because of technical advantages of Solaris.
    - IT departments can benefit from hiring computer scientists into Systems Programmer or similar roles. Their computer science background should be nurtured so that they can continue to deliver value (savings and opportunity) to the business as technology advances.

    If you are a sysadmin, developer, or somewhere in between, the presentation will show you:
    - A presentation that explains your technical analysis can be very influential.
    - Learning and using the non-default options of an OS can make all the difference as to whether one OS is better suited than another. For example, see the graphs on slides 3 - 5. The ZFS default is to not use compression.
    - When trying to convince those that hold the purse strings that your technical direction should be taken, the financial impact can be the part that closes the deal. See slides 6, 9, and 10. Sometimes reducing rack space requirements can be the biggest impact because it may stave off or completely eliminate the need for facilities growth.
    - DTrace can be used to shine light on performance problems that may be suspected but not diagnosed. It is quite likely that these problems have existed in OpenAFS for a decade or more. DTrace made diagnosis possible.
    - DTrace can be used to create performance analysis tools without modifying the source of the software under analysis. See slides 29 - 32.
    - Microstate accounting, visible in the prstat output on slide 37, can be used to quickly draw focus to problem areas that affect CPU saturation. Note that prstat without -m gives a time-decayed moving average that is not nearly as useful.
    - Instruction-level probes (slides 33 - 34) are a super-easy way to identify which part of a function is hot.
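
    For reference, a sketch of the microstate-accounting view that the prstat bullet refers to (standard Solaris prstat flags; the 5-second interval is arbitrary):

        # -m: microstate accounting, -L: one line per lwp/thread, 5: sampling interval
        prstat -mL 5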

    Read the article

  • ASP.NET website deployment [on hold]

    - by Rei Brazilva
    I am getting my hands wet with ASP.NET and I have been following the tutorials. I deployed the site to Azure and it worked great. Today I started actually designing the site, and when I published, it looks as if it doesn't read any of the files I just updated, added, and modified. It works on my localhost, but not on Azure. I thought when you publish, everything goes up, including the new files. I don't have enough reputation to add a picture, so you'll forgive me. So, basically, how do I get my entire site uploaded? In case anyone does stop by, I was able to pull this out just recently:

        CA0058 Error Running Code Analysis CA0058 : The referenced assembly 'DotNetOpenAuth.AspNet, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246' could not be found. This assembly is required for analysis and was referenced by: C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\WebsiteManager\bin\WebsiteManager.dll, C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\packages\Microsoft.AspNet.WebPages.OAuth.2.0.20710.0\lib\net40\Microsoft.Web.WebPages.OAuth.dll. [Errors and Warnings] (Global)

        CA0001 Error Running Code Analysis CA0001 : The following error was encountered while reading module 'Microsoft.Web.WebPages.OAuth': Assembly reference cannot be resolved: DotNetOpenAuth.AspNet, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246. [Errors and Warnings] (Global)

    Could this have something to do with the problem?

    Read the article

  • JUnit Testing in Multithread Application

    - by e2bady
    This is a problem my team and I face in almost all of our projects. Testing certain parts of the application with JUnit is not easy; you need to start early and stick with it, but that's not the question I'm asking. The actual problem is that with n threads, locking, possible exceptions within the threads, and shared objects, the task is not as simple as testing a class: it means testing it under endless possible interleavings. To be more precise, let me tell you about the design of one of our applications: when a user makes a request, several threads are started that each analyse a part of the data. These threads run for a certain time depending on the size of the chunk of data to analyse (the chunks are endless and of uncertain quality), or they may fail if the data is insufficient or lacking in quality. After each completes its analysis, it calls a handler, which decides after each thread terminates whether the collected analysis data is sufficient to deliver an answer to the request. All of these analysers share certain parts of the application (some parts because the instances are very big, only a certain number can be loaded into memory, and those instances are reusable; some parts because they hold a standing connection, where connecting takes time, e.g. SQL connections), so locking is very common (done with reentrant locks). While the application runs very efficiently and fast, it's not easy to test under real-world conditions. What we do right now is test each class and its predefined conditions, but there are no automated tests for interlocking and synchronization, which in my opinion is not good for quality assurance. Given this example, how would you handle testing the threading, interlocking and synchronization?
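
    One deterministic building block for such tests, as a minimal sketch (JUnit 4; class and names hypothetical): coordinate the worker threads with a CountDownLatch so the assertion runs only after every analyser thread has terminated, instead of sleeping and hoping:

        import static org.junit.Assert.assertEquals;

        import java.util.concurrent.CountDownLatch;
        import java.util.concurrent.atomic.AtomicInteger;
        import org.junit.Test;

        public class AnalyserConcurrencyTest {

            @Test
            public void everyAnalyserCompletesExactlyOnce() throws InterruptedException {
                final int threads = 8;
                final AtomicInteger completed = new AtomicInteger();
                final CountDownLatch done = new CountDownLatch(threads);

                for (int i = 0; i < threads; i++) {
                    new Thread(new Runnable() {
                        public void run() {
                            try {
                                completed.incrementAndGet(); // stand-in for one analysis pass
                            } finally {
                                done.countDown();            // signal even if the analysis throws
                            }
                        }
                    }).start();
                }

                done.await(); // deterministic rendezvous: no Thread.sleep() guessing
                assertEquals(threads, completed.get());
            }
        }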

    Read the article

  • C++ Numerical Recipes &ndash; A New Adventure!

    - by JoshReuben
    I am about to embark on a great journey: over the next 6 weeks I plan to read through C++ Numerical Recipes, 3rd edition (http://amzn.to/YtdpkS). I'll be reading this with an eye to C++ AMP, thinking about implementing the suitable subset (non-recursive, additive, commutative) to run on the GPU. APIs supporting HPC, GPGPU or MapReduce are all useful, provided you have the ability to choose the correct algorithm to leverage on them. I really think this is the most fascinating area of programming, a lot more exciting than LOB CRUD! When you think about it, everything is a function: we categorize and we extrapolate. As abstractions get higher and less leaky, sooner or later information systems programming will become a non-programmer task; you will be using WYSIWYG designers to build GUIs, MVVM, service mapping and virtualization, workflows, and ORM entity relations in the data source. SharePoint / LightSwitch are not there yet, but every iteration gets closer. For information workers, managed code is a race to the bottom. As MS futures are a bit shaky right now, the provider-agnostic nature and higher barriers to entry of both C++ and numerical analysis seem like a rational choice to me. It's also fascinating, stepping outside the box. This is not the first time I've delved into numerical analysis. 6 months ago I read Numerical Methods with Applications, which can be found for free online: http://nm.mathforcollege.com/. 2 years ago I learned the .NET Extreme Optimization library (www.extremeoptimization.com), which is not bad. 2.5 years ago I read Schaum's Numerical Analysis book (http://amzn.to/V5yuLI), not an easy read, as topics jump back and forth across chapters. 3 years ago I read Practical Numerical Methods with C# (http://amzn.to/V5yCL9), C# being a toy learning language for this kind of stuff. I also read through AI: A Modern Approach, 3rd edition, end to end (http://amzn.to/V5yQSp); this took me a few years but was the most rewarding experience. I'll post progress updates. See you on the other side!

    Read the article

  • HTML Tidy in NetBeans IDE (Part 2)

    - by Geertjan
    This is what I was aiming for in the previous blog entry: What you can see above (especially if you click to enlarge it) is that I have HTML Tidy integrated into the NetBeans analyzer functionality, which is pluggable from 7.2 onwards. Well, if you set an implementation dependency on "Static Analysis Core", since it's not an official API yet. Also, the scopes of the analyzer functionality are not pluggable. That means you can 'only' set the analyzer's scope to one or more projects, one or more packages, or one or more files. Not one or more folders, which means you can't have a bunch of HTML files in a folder that you access via the Favorites window and then run the analyzer on that folder (or on multiple folders). Thus, to try out my new code, I had to put some HTML files into a package inside a Java application. Then I chose that package as the scope of the analyzer. Then I ran all the analyzers (i.e., standard NetBeans Java hints, FindBugs, as well as my HTML Tidy extension) on that package. The screenshot above is the result. Here's all the code for the above, which is a port of the Action code from the previous blog entry into a new Analyzer implementation:

        import java.io.IOException;
        import java.io.PrintWriter;
        import java.io.StringWriter;
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;
        import javax.swing.JComponent;
        import javax.swing.text.Document;
        import org.netbeans.api.fileinfo.NonRecursiveFolder;
        import org.netbeans.modules.analysis.spi.Analyzer;
        import org.netbeans.modules.analysis.spi.Analyzer.AnalyzerFactory;
        import org.netbeans.modules.analysis.spi.Analyzer.Context;
        import org.netbeans.modules.analysis.spi.Analyzer.CustomizerProvider;
        import org.netbeans.modules.analysis.spi.Analyzer.WarningDescription;
        import org.netbeans.spi.editor.hints.ErrorDescription;
        import org.netbeans.spi.editor.hints.ErrorDescriptionFactory;
        import org.netbeans.spi.editor.hints.Severity;
        import org.openide.cookies.EditorCookie;
        import org.openide.filesystems.FileObject;
        import org.openide.loaders.DataObject;
        import org.openide.util.Exceptions;
        import org.openide.util.lookup.ServiceProvider;
        import org.w3c.tidy.Tidy;

        public class TidyAnalyzer implements Analyzer {

            private final Context ctx;

            private TidyAnalyzer(Context cntxt) {
                this.ctx = cntxt;
            }

            @Override
            public Iterable<? extends ErrorDescription> analyze() {
                List<ErrorDescription> result = new ArrayList<ErrorDescription>();
                for (NonRecursiveFolder sr : ctx.getScope().getFolders()) {
                    FileObject folder = sr.getFolder();
                    for (FileObject fo : folder.getChildren()) {
                        for (ErrorDescription ed : doRunHTMLTidy(fo)) {
                            if (fo.getMIMEType().equals("text/html")) {
                                result.add(ed);
                            }
                        }
                    }
                }
                return result;
            }

            private List<ErrorDescription> doRunHTMLTidy(FileObject sr) {
                final List<ErrorDescription> result = new ArrayList<ErrorDescription>();
                Tidy tidy = new Tidy();
                StringWriter stringWriter = new StringWriter();
                PrintWriter errorWriter = new PrintWriter(stringWriter);
                tidy.setErrout(errorWriter);
                try {
                    Document doc = DataObject.find(sr).getLookup()
                            .lookup(EditorCookie.class).openDocument();
                    tidy.parse(sr.getInputStream(), System.out);
                    String[] split = stringWriter.toString().split("\n");
                    for (String string : split) {
                        // Bit of ugly string parsing coming up:
                        if (string.startsWith("line")) {
                            final int end = string.indexOf(" c");
                            int lineNumber = Integer.parseInt(
                                    string.substring(0, end).replace("line ", ""));
                            string = string.substring(string.indexOf(": ")).replace(":", "");
                            result.add(ErrorDescriptionFactory.createErrorDescription(
                                    Severity.WARNING,
                                    string,
                                    doc,
                                    lineNumber));
                        }
                    }
                } catch (IOException ex) {
                    Exceptions.printStackTrace(ex);
                }
                return result;
            }

            @Override
            public boolean cancel() {
                return true;
            }

            @ServiceProvider(service = AnalyzerFactory.class)
            public static final class MyAnalyzerFactory extends AnalyzerFactory {

                public MyAnalyzerFactory() {
                    super("htmltidy", "HTML Tidy", "org/jtidy/format_misc.gif");
                }

                public Iterable<? extends WarningDescription> getWarnings() {
                    return Collections.EMPTY_LIST;
                }

                @Override
                public <D, C extends JComponent> CustomizerProvider<D, C> getCustomizerProvider() {
                    return null;
                }

                @Override
                public Analyzer createAnalyzer(Context cntxt) {
                    return new TidyAnalyzer(cntxt);
                }
            }
        }

    The above only works on packages, not on projects and not on individual files.

    Read the article

  • Project Showcase: SaaS Web Apps Hits a Home Run with New SCMS Database

    - by Webgui
    We love seeing projects from start to finish, and we're happy to share the latest example with you.

    Who: SaaS Web Apps. They use Software as a Service to create web applications that look and feel like desktop applications.

    What: SaaS Web Apps needed to build a Sports Contract Management System (SCMS) for one of its customers, Premier Stinson Sports.

    Why: The SCMS database is used for collecting, analyzing and recording college coach and athletic directors' employment and contract data.

    The Challenge: Premier Stinson Sports works with a number of partners, each with its own needs and unique requirements. For example, USA Today uses the system to provide cutting-edge news analysis, while The National Sports Law Institute of Marquette University Law School uses it for the latest sports contract data and student analysis. In addition, the system needed to be secure due to the sensitivity of the data; it was essential that user security and permissions be easily configurable. As always, performance was a key factor, especially with the intense reporting and analytical capabilities of this project. Because of this, most of the processing had to be done on a dedicated server, but the project called for the richness and responsiveness of a desktop application.

    The Solution: To execute the project, SaaS Web Apps used ASP.NET-based Visual WebGui from Gizmox, combined with SQL Server 2008 and SQL Reporting Services. This combination resulted in a quick deployment for SaaS Web Apps' customers.

    The Result: The completed project gave each partner the scalability and availability of a web application with the performance and security of a desktop application. As an example, USA Today pulls data from this database to give readers the latest sports stats (salary analysis of 2010 Football Bowl Subdivision coaches). And here's a screenshot of the database itself. Great work, SaaS Web Apps!

    Read the article

  • Resharper and Custom Live Template Not Working

    - by Bryce Fischer
    Working with Unity, I thought it would be a good idea to create some Live Templates to help with creating configuration entries. For example, I want to create typeAlias elements in a file called "unity.config":

        <typeAlias alias="QueryService" type="type,QueryAssembly"/>

    So I created a live template:

        Shortcut: typeAlias
        Available: "in all files"
        <typeAlias alias="$ALIAS$" type="$TYPE$,$ASSEMBLY$"/>

    The unity.config file is an XML file. I put the cursor in an empty spot, type "typeAlias", and then hit the Tab key. Nothing happens. Any ideas?

    Read the article

  • java: decoding URI query string

    - by Jason S
    I need to decode a URI that contains a query string; expected input/output behavior is something like the following:

        abstract class URIParser {
            /** example input:
             *  something?alias=pos&FirstName=Foo+A%26B%3DC&LastName=Bar */
            URIParser(String input) { ... }

            /** should return "something" for the example input */
            public String getPath();

            /** should return a map
             *  {alias: "pos", FirstName: "Foo+A&B=C", LastName: "Bar"} */
            public Map<String,String> getQuery();
        }

    I've tried using java.net.URI, but it decodes the query string, so in the above example I'm left with "alias=pos&FirstName=Foo+A&B=C&LastName=Bar", and there is ambiguity as to whether a "&" is a query separator or a character within a query component. Edit: I just tried URI.getRawQuery() and it doesn't do the decoding, so I can split the query string on "&", but then what do I do? Any suggestions?
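
    A minimal sketch of the split-then-decode approach (class and method names hypothetical), assuming UTF-8 percent-encoding. The separators stay unambiguous because the split happens while the string is still encoded; note that URLDecoder also turns "+" into a space, which differs slightly from the literal "+" in the asker's expected output:

        import java.io.UnsupportedEncodingException;
        import java.net.URI;
        import java.net.URLDecoder;
        import java.util.LinkedHashMap;
        import java.util.Map;

        public class RawQueryParser {
            public static Map<String, String> parse(URI uri) throws UnsupportedEncodingException {
                Map<String, String> params = new LinkedHashMap<String, String>();
                String raw = uri.getRawQuery();     // still percent-encoded, so "&" is unambiguous
                if (raw == null) return params;
                for (String pair : raw.split("&")) {
                    int eq = pair.indexOf('=');     // first "=" separates key from value
                    String key = eq < 0 ? pair : pair.substring(0, eq);
                    String val = eq < 0 ? ""   : pair.substring(eq + 1);
                    params.put(URLDecoder.decode(key, "UTF-8"),
                               URLDecoder.decode(val, "UTF-8"));
                }
                return params;
            }
        }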

    Read the article

  • [Newbie] How to join mysql tables

    - by Ivan
    I've got an old table like this:

        user> id | name | address | comments

    Now I have to create an "alias" table so that some users can have an alias name. I've created a new table 'user_alias' like this:

        user_alias> name | user

    But now I have a problem due to my poor SQL level... How do I join both tables to generate something like this:

        1 | my_name    | my_address    | my_comments
        1 | my_alias   | my_address    | my_comments
        2 | other_name | other_address | other_comments

    I mean, I want to write a "SELECT..." query that returns ALL users and ALL aliases in the same format as the "user" table. Something like this:

        SELECT user.* FROM user LEFT JOIN user_alias ON `user`=`id`

    but it doesn't work for me.
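
    One way to produce exactly that shape, as a sketch assuming user_alias.user holds the user's id: select the base rows, then UNION in one extra row per alias with the alias substituted for the name:

        SELECT u.id, u.name, u.address, u.comments
        FROM user u
        UNION ALL
        SELECT u.id, a.name, u.address, u.comments
        FROM user u
        JOIN user_alias a ON a.user = u.id
        ORDER BY id;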

    Read the article

  • nHibernate Criteria API Projections

    - by Craig
    I have an entity that looks like this:

        public class Customer
        {
            public Customer()
            {
                Addresses = new List<Address>();
            }
            public int CustomerId { get; set; }
            public string Name { get; set; }
            public IList<Address> Addresses { get; set; }
        }

    And I am trying to query it using the Criteria API like this:

        ICriteria query = m_CustomerRepository.Query()
            .CreateAlias("Address", "a", NHibernate.SqlCommand.JoinType.LeftOuterJoin);

        var result = query
            .SetProjection(Projections.Distinct(
                Projections.ProjectionList()
                    .Add(Projections.Alias(Projections.Property("CustomerId"), "CustomerId"))
                    .Add(Projections.Alias(Projections.Property("Name"), "Name"))
                    .Add(Projections.Alias(Projections.Property("Addresses"), "Addresses"))
            ))
            .SetResultTransformer(new AliasToBeanResultTransformer(typeof(Customer)))
            .List<Customer>() as List<Customer>;

    When I run this query, the Addresses property of the Customer object is null. Is there any way to add a projection for this List property?
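
    A commonly suggested alternative, offered here only as a sketch rather than a verified fix: a projection list projects scalar columns, so a collection property generally cannot be filled that way. Instead, fetch the root entity with the outer join and collapse the duplicated rows using NHibernate's DistinctRootEntityResultTransformer:

        // Sketch: fetch whole Customer entities instead of projecting scalars,
        // then de-duplicate the outer-join rows back to distinct root entities.
        var customers = m_CustomerRepository.Query()
            .CreateAlias("Addresses", "a", NHibernate.SqlCommand.JoinType.LeftOuterJoin)
            .SetResultTransformer(new NHibernate.Transform.DistinctRootEntityResultTransformer())
            .List<Customer>();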

    Read the article

  • In Drupal, how to change the values passed to Pathauto?

    - by Vinicius Pinto
    I have Pathauto configured to generate an alias based on the title of a node, for a specific content type. The problem is that I want to make small changes to this title before Pathauto uses it to generate the alias. The first comment in this post suggests using hook_token_values, but I couldn't really understand how to use it, even after reading the docs. In my tests, when I implement this hook, the generated alias is always "array", which means I'm missing something. Any help? Thanks.

    Read the article

  • Using perl's Regexp::Grammars, how do I make a capture dependent on $MATCH?

    - by Evan Carroll
    I've got a token like this: <delim2=((?{ $MATCH{delim} }))>, and what I want is for delim2 to capture and be set to the value of delim. When I run this, delim2 is set, but the capture is never done. I think there is an error in my reasoning. I'm trying to chain this form:

        <ALIAS= ( PATTERN )>    # match PATTERN, save the match in $MATCH{ALIAS}

    and this form:

        (?{ $MATCH{delim} })

    into something like this:

        <ALIAS= ( (?{ $MATCH{delim} }) )>   # match the value of $MATCH{delim}, save to $MATCH{delim2}

    but this simply doesn't seem valid. I can verify my original token works: <delim2=((?{ die $MATCH{delim} }))> will die with the value, and if I hard-code the delimiter, as in <delim2=(')>, I get the right capture and everything works. So how do I go about achieving sane results while having a dynamic pattern?

    Read the article

  • Why did the C# designers attach three different meanings to the 'using' keyword?

    - by gWiz
    The using keyword has three disparate meanings:
    - type/namespace aliasing
    - namespace import
    - syntactic sugar for ensuring Dispose is called

    The documentation calls the first two definitions directives (which I'm guessing means they are preprocessing in nature), while the last is a statement. Regardless of the fact that they are distinguished by their syntaxes, why would the language developers complicate the semantics of the keyword by attaching three different meanings to it? For example (disclaimer: off the top of my head, there may certainly be better examples), why not add keywords like alias and import? Technical, theoretical, or historical reasons? Keyword quota? ;-) Contrived sample:

        import System.Timers;
        alias LiteTimer = System.Threading.Timer;
        alias WinForms = System.Windows.Forms;

        public class Sample {
            public void Action {
                var elapsed = false;
                using (var t = new LiteTimer.Timer(_ => elapsed = true) {
                    while (!elapsed) CallSomeFinickyApi();
                }
            }
        }

    "Using" is such a vague word.

    Read the article

  • Keystore and Aliases - is there a use to multiple aliases?

    - by Steve H
    When exporting a signed Android application using Eclipse, is there a purpose to using multiple aliases? According to the official guide about signing, it's recommended that you sign all applications with the same certificate to allow your applications to share data, code and be updated in modular fashion. Assuming that "alias", "key" and "certificate" are essentially interchangeable in this context, is there a reason why someone would want to use different aliases for all their applications? The only reason I can think of is that it adds more security to your applications, in the sense that a compromised key/password doesn't compromise everything. Are there other reasons? Also, is the generated key dependent on the name of the alias? In other words, if you change the name of the alias but not the password, would the generated certificate be different? Thanks.
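
    For background, a keystore can hold many independent key pairs, one per alias; the alias is just the lookup name, and each generation run produces fresh key material regardless of what the alias is called. A sketch with the standard JDK keytool (keystore file and alias names hypothetical):

        keytool -genkeypair -v -keystore release.keystore -alias app-one \
                -keyalg RSA -keysize 2048 -validity 10000
        keytool -genkeypair -v -keystore release.keystore -alias app-two \
                -keyalg RSA -keysize 2048 -validity 10000
        keytool -list -keystore release.keystore   # shows both aliases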

    Read the article

  • Personal Project - Next practical language/tech to learn

    - by Paul Nathan
    I'm working on a personal project doing some finance analysis. It's a totally new field for me, and I'm really having fun with it so far; plus, working in the high-level language arena is a great break from my embedded-systems day job. I have a MySQL backend on a non-local server with a pile of stock data. My task now is to do some analysis of the stocks and produce something approximating a useful result. There are a couple of technical difficulties:

    (1) I have a lot of records. To be precise, I believe I'm near 100K records right now, and this number grows by 6.1K each weekday. I need a way to rummage through these fields and do data analysis: based on a given computation, go look at this other set. Fine and dandy, nothing too outre. But this means I could really use a straightforward API for talking to MySQL.

    (2) Ideally, it runs on OS X 10.4.11. No Windows/Linux machine at home.

    (3) I can use PHP, C++, Perl, etc. I even have an R installation. I'm pretty flexible with stuff, so long as it runs on OS X. (Lots of options here; pick water, H2O, or dihydrogen monoxide ;-) )

    (4) Lack of hassle. While I like clever and fun ways of doing things, I'm trying to get some analysis done, not spend ten hours doing installation work and scratching my head figuring out a theoretical syntax question needed to spout out "hello world".

    What's the question? I'd like to dig into something different than my usual PHP/C++/C toolset. I'm looking for recommendations for languages/technologies that will assist me and meet the above requirements. In particular, I've heard a lot of buzz about F# and Python on SO. I've used CLISP for small problems before and kinda liked it. I'm seeking opinions about those in particular.

    Edit: since I rent the DB server and have a limited amount of CPU time online, I'm trying to do the analysis on a local machine.

    Read the article

  • SQL SERVER – List of All the Samples Database Available to Download for FREE

    - by Pinal Dave
    It is very common to have a sample database for any database product. Companies keep improving their products and coming up with innovations, and to demonstrate the capability of new enhancements they need sample databases. Microsoft has various sample databases available for free download for its SQL Server product. I have collected them here in a single blog post.

    Download an AdventureWorks Database: The AdventureWorks OLTP database supports standard online transaction processing scenarios for a fictitious bicycle manufacturer (Adventure Works Cycles). Scenarios include Manufacturing, Sales, Purchasing, Product Management, Contact Management, and Human Resources.

    Coconut Dal: Coconut Dal is a lightweight data access layer, for use in projects where the Entity Framework cannot be used or Microsoft's Enterprise Library Data Block is unsuitable. Anyone who is handwriting ADO.NET should use a library instead, and Coconut Dal might be the answer.

    DataBooster (Extension to ADO.NET Data Provider): The dbParallel DataBooster library is a high-performance extension to the ADO.NET Data Provider, with two aspects: 1) a slimmed-down API encapsulation that simplifies the most common data access operations (DbConnection -> DbCommand -> DbParameter -> DbDataReader) into a single class, DbAccess, to help applications keep a clean DAL and avoid over-packing and redundant copying of transferred data; 2) a booster for writing mass data to a database, based on rational utilization of database concurrency and effective utilization of network bandwidth.

    Tabular AMO 2012: The sample is made of two parts. The first part is a library of functions to manage tabular models (AMO2Tabular V2). The second part is a sample that builds a tabular model (AdventureWorks Tabular AMO 2012) using the AMO2Tabular library; the created model is similar to the AdventureWorks Tabular Model 2012.

    SQL Server Analysis Services Product Samples: SQL Server Analysis Services provides a unified and integrated view of all your business data as the foundation for all of your traditional reporting, online analytical processing (OLAP) analysis, Key Performance Indicator (KPI) scorecards, and data mining.

    Analysis Services Samples for SQL Server 2008 R2: This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2. For many of these samples you will also need to download the AdventureWorks family of databases.

    SQL Server Reporting Services Product Samples: This project contains Reporting Services samples released with the Microsoft SQL Server product, in five categories: Application Samples, Extension Samples, Model Samples, Report Samples, and Script Samples. If you are interested in contributing Reporting Services samples, please let us know by posting in the developers' forum.

    Reporting Services Samples for SQL Server 2008 R2: This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2 PCU1. For many of these samples you will also need to download the AdventureWorks family of databases.

    SQL Server Integration Services Product Samples: This project contains Integration Services samples released with the Microsoft SQL Server product, in two categories: Package Samples and Programming Samples. If you are interested in contributing Integration Services samples, please let us know by posting in the developers' forum.

    Integration Services Samples for SQL Server 2008 R2: This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2. For many of these samples you will also need to download the AdventureWorks family of databases.

    Windows Azure SQL Reporting Admin Sample: The SQLReportingAdmin sample for Windows Azure SQL Reporting demonstrates the usage of the SQL Reporting APIs and manages (adds/updates/deletes) permissions of SQL Reporting users.

    Windows Azure SQL Reporting ReportViewer-SOAP API usage sample: These sample projects demonstrate how to embed a Microsoft ReportViewer control that points to reports hosted on SQL Reporting report servers, and how to use SQL Reporting SOAP APIs in your Windows Azure web application.

    Enterprise Library 5.0 Integration Pack for Windows Azure: This NuGet package contains a zip file with the source code for the Enterprise Library Integration Pack for Windows Azure.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL Sample Database

    Read the article
