Physician Code Creep: Evidence in Medicaid and State Employee Health Insurance Billing

Abstract

This study estimates a fixed effects ordered logit model of physician office visit billing using claims data from South Carolina Medicaid and the State Employees Health Plan. The results show code creep increasing expenditures on physician office visits at a rate of 2.2 percent annually in both programs, with no significant difference in the rate between the two. The models also indicate that physician billing patterns differ between the programs, with Medicaid claims averaging 1.3 percent less per visit than comparable State Employees Health Plan claims.

Introduction

Many studies refer to code creep as a contributing factor to improper billing, but policymakers have few estimates of its magnitude to use for guidance. Despite the lack of studies estimating code creep and improper billing, the 2005 Deficit Reduction Act progressively increases funding for the Medicaid Integrity Program, reaching its maximum of $75 million in 2009. With few studies to guide policy, Medicaid agencies have little guidance on whether code creep is a problem they should target with the assistance of the 2005 Deficit Reduction Act. This article estimates an upper bound for code creep in physician office billing for the State Medicaid Program in South Carolina.

A formal definition of code creep is elusive, but Steinwald and Dummit (1989) summarized code creep as “…changes in hospital record keeping practices to increase case mix and reimbursement.” Code creep is also often referred to as upcoding and, in hospital billing, diagnosis-related group (DRG creep). Finally, not all temporal variation in coding falls under code creep. Changes over time in billing can also be attributable to true changes in case mix (sicker patients), improvements in coding (both in provider education and in degree of detail in codes or their definitions), and changes instituted by the payer (program reforms) (Carter, Newhouse, and Relles, 1990).

The code creep literature has focused primarily on hospital billing of DRGs, especially following Medicare's switch to the prospective payment system (PPS) in the 1980s. Results for these early studies proved mixed. Multiple studies did find evidence for DRG creep during the implementation of PPS (Steinwald and Dummit, 1989; Chulis, 1991; Hsia et al., 1988) with the estimates falling below 3 percent. Subsequent studies found no evidence of code creep that could not be attributed to true case mix changes and improved coding practices (Hsia et al., 1992; Carter, Newhouse, and Relles, 1990).

After the articles assessing the billing impact of the switch to PPS, academic interest in code creep became sporadic. Unlike the mixed results examining PPS, later studies produced repeated evidence indicating that code creep exists. Survey data have indicated that 44 percent of health care managers have received pressure from their senior managers to promote coding optimization, but 33 percent reported that their coding behavior varies depending on the payer (Lorence and Richards, 2002; Lorence and Spink, 2002). Other authors have examined specific diagnoses that provide strong incentives to code a higher complexity in that diagnosis family. Silverman and Skinner (2004) found extensive code creep for pneumonia across all hospitals, but the largest increase appeared in for-profit hospitals, hospitals converting to for-profit status, and hospitals where physicians have an equity stake. Similarly, Psaty et al. (1999) examined charts for patients diagnosed with heart failure and could find no documentation in 38 percent of the charts to support higher reimbursement diagnoses. Lastly, code creep has not been limited to U.S. hospitals, with German studies attributing 1 percent of all inpatient payments to code creep (Lungen and Lauterbach, 2000).

Few studies examine physician billing for office visits in the U.S. Two studies in Canada have found that code creep is not limited to hospitals and also occurs in Canadian physician offices (Nassiri and Rochaix, 2006; Chan, Anderson, and Theriault, 1998). Evidence of code creep for physician office billing in the U.S. remains indirect. Wynia et al. (2000) surveyed physicians and found that 39 percent of physicians reported manipulating reimbursement rules, with 54 percent indicating that they were manipulating their billing more frequently in 1998 than they did in 1993. Interestingly, fear of prosecution did not affect the billing decisions of physicians admitting to manipulating reimbursement rules. Lastly, Cromwell et al. (2006) cited code creep as one possible explanation why the physicians in their study dedicated up to 32 percent less time to patient visits than the visit times associated with the Medicare fee schedule.

This study expands on previous work in three ways. First, this study will be the first to examine code creep in Medicaid. Excluding those using survey methods, all code creep studies in the U.S. have examined Medicare data. Second, it will be the first to examine billing for the same providers across two payers by comparing physicians' Medicaid billing to their own billing in the South Carolina State Employees Health Plan. Lastly, this study will be the first to estimate the magnitude of code creep for physician office visit billing. Specifically, this study tests (1) whether physicians bill office visits at equal levels of complexity across the two State programs, (2) whether the billing behavior displays unexplained changes over time, and (3) estimates the rate of increase for physician billing.

Methodology

In State fee-for-service programs, physician prices are routinely set by a fixed price schedule or through negotiations with the payer. Although prices are fixed, physicians still have the power to choose the level of complexity or billing code for the visit. If probability of detection is low, profit maximizing physicians can be expected to choose higher reimbursement codes or upcode on the margin.

Tables 1 and 2 present an overview of the physician's choice set when assigning a code to an office visit. When billing the visit for an established patient (a patient seen previously), the provider must choose from one of the five billing codes listed in Table 1. The American Medical Association (2004) establishes definitions for the codes, and an extensive literature explains and analyzes each in detail (Hill, 2001; King, Sharp, and Lipsky, 2001). For visits dominated by counseling or coordination of care, a physician may assign the billing code based on the length of the visit. Otherwise, the provider bases the code assignment on the complexity of the visit (documenting the problem's history, the examination, and the complexity of the medical decisionmaking). Established visits are most frequently billed by complexity, and in these cases the visit must meet or exceed the criteria for two of the three complexity categories (history, examination, and medical complexity) listed in Table 1. Lastly, payers reimburse providers for each visit based on the reported complexity and the administratively set rates attached to that billing code.

Table 1

Physician Evaluation and Management Service Codes for Established Patients

Table 2

South Carolina's Reimbursement Rates for Physician Evaluation and Management Service Codes: 2001-2003

Table 2 lists the median reimbursements paid for office visits in South Carolina Medicaid and the State Employees Health Plan programs. These median reimbursements are calculated from the full population of all paid office visit claims and reflect payment adjustments for provider type (nurse practitioner, specialist, etc.). Over the 3 years in the study, reimbursement rates for the established patient visits remained flat for both plans. Reimbursement for the most common Medicaid code, 99213, in 2001 was $36 and in 2003 declined to $35 ($44 and $47 for the State Employees Health Plan). For the less common new patient and consultation codes, reimbursement rates remained flat in the State Employees Health Plan and declined for Medicaid. In 2001 and 2002, Medicaid utilized a separate rate schedule for specialists and paid nurse practitioners at a discount to the general practitioner rate.

The question of whether flat reimbursement rates influence providers' coding of the complexity of office visits is examined here. Reimbursement rates influencing providers' coding choices would contrast with the accepted view that prices are exogenous for physicians (that providers accept prices as given). If a physician considered the probability of detection low, a substantial incentive exists for the provider to upcode, or report visits of higher complexity. Although payment rates for individual codes changed little over the study period, a provider could obtain a 50- to 60-percent increase in reimbursement for a visit by assigning a code one level higher than the true code for the visit.
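The incentive can be made concrete with a back-of-the-envelope calculation. The Level 4 rate below is a hypothetical value chosen for illustration; only the $36 Level 3 Medicaid rate appears in the text.

```python
# Illustrative only: the Level 4 rate is hypothetical, not taken from
# Table 2. With flat per-code rates, the payoff to upcoding comes
# entirely from moving a visit one level up the code ladder.
rate_level3 = 36.0   # 2001 Medicaid rate for code 99213 (from the text)
rate_level4 = 55.0   # hypothetical Level 4 rate, for illustration only
upcode_gain = rate_level4 / rate_level3 - 1   # fractional gain from upcoding
```

With these numbers, billing one level higher raises the payment by roughly half, in line with the 50- to 60-percent range cited above.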

Data

This study utilizes 2001–2003 health care claims from South Carolina Medicaid and the State Employees Health Plan to estimate a fixed effects ordered logit model of physician office visit billing. The initial data set began as the full population of all paid Medicaid and State Employees Health Plan physician office visits. The analysis excludes claims at locations other than the provider's office. Limited information on providers outside of South Carolina required the elimination of claims from any provider with an address outside of the State. Physicians providing fewer than 150 total fee-for-service visits to Medicaid and State Employees Health Plan patients over the 3-year period were dropped from the data. Due to the very large number of remaining claims, a random sample was drawn of 500 providers and, for each provider, up to 800 Medicaid visits and 800 State Employees Health Plan visits (1,600 visits total). The sample retained all Medicaid or State Employees Health Plan claims for physicians who provided fewer than 800 visits in that program. This sampling procedure produced a final dataset of 204,945 office visits for the 500 providers.
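The sampling step described above can be sketched as follows. The function name and data layout are illustrative assumptions, not the author's actual code.

```python
import random

# Hypothetical sketch of the sampling step: draw a random set of
# providers, then for each provider keep up to `cap` visits per program,
# retaining all visits when a provider has fewer than `cap` in a program.
def sample_visits(visits_by_provider, n_providers, cap, seed=1):
    """visits_by_provider: {provider_id: {program_name: [visit, ...]}}."""
    rng = random.Random(seed)
    providers = rng.sample(sorted(visits_by_provider), n_providers)
    sample = []
    for prov in providers:
        for program_visits in visits_by_provider[prov].values():
            if len(program_visits) <= cap:
                sample.extend(program_visits)            # keep all visits
            else:
                sample.extend(rng.sample(program_visits, cap))
    return sample
```

In the paper's case this would run with n_providers=500 and cap=800 per program, so a provider active in both programs contributes at most 1,600 visits.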

Provider identification proved difficult in some cases. Although every physician is assigned a unique provider identification number, many group practices file all claims under a single group identification number. Since groups share billing resources and behaviors, the model analyzes billing behavior at the group level. The Federal tax identification number (FTIN) filed with each claim allowed the linking of providers (or groups for multiphysician practices) across programs. Not all providers participated in both programs, and some physicians filed claims under separate FTINs for each program. The analysis controls for these physicians who do not participate in both programs or who could not be linked across programs.

Model Specification

The model combines three classes of office visits into a single visit complexity variable. Routine office visits fall under new patient visits (codes 99201-99205), established patient visits (codes 99211-99215), or consultations (codes 99241-99245), with each group broken into five codes representing lowest through highest complexity. The study considers five potential outcomes:

  • Y1 = Codes 99201, 99211, or 99241

  • Y2 = Codes 99202, 99212, or 99242

  • Y3 = Codes 99203, 99213, or 99243

  • Y4 = Codes 99204, 99214, or 99244

  • Y5 = Codes 99205, 99215, or 99245
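The three code families share a common structure: the final digit orders each family from lowest to highest complexity. Collapsing them into the five outcomes can be sketched with a small helper (the helper itself is hypothetical):

```python
# Hypothetical helper: the last digit of a routine office visit code
# gives its complexity level within its family (new patient 9920x,
# established patient 9921x, consultation 9924x).
EM_FAMILIES = {"9920", "9921", "9924"}

def complexity_level(cpt_code: str) -> int:
    """Return the 1-5 ordinal outcome for a routine office visit code."""
    prefix, last = cpt_code[:4], int(cpt_code[4])
    if prefix not in EM_FAMILIES or not 1 <= last <= 5:
        raise ValueError(f"not a routine office visit code: {cpt_code}")
    return last
```

For example, 99213 and 99243 both map to the outcome Y3.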

With the ranked nature of the dependent variable, an ordered logit can estimate the probability of choosing outcome Yj,

Pr(Yi = j) = Pr(kj−1 < β′Xi + ui < kj)

(1)

where the probability of outcome j is the probability that the estimated linear function of the independent variables, plus a logistic distributed random error ui, lies between the estimated cut-points kj−1 and kj (Zavoina and McKelvey, 1975; Greene, 2003). Stata® Version 8 (StataCorp LP, 2003) provided a convenient estimator for the ordered logit models.
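A minimal numerical sketch of equation (1): given a linear index x′β and cut-points k1 < k2 < k3 < k4, the five outcome probabilities are differences of the logistic CDF. The values below are illustrative, not the paper's estimates.

```python
import numpy as np

# Minimal sketch of equation (1): for a ranked outcome, the ordered
# logit gives Pr(Y = j) = F(k_j - x'b) - F(k_(j-1) - x'b), where F is
# the logistic CDF, k_0 = -infinity, and k_5 = +infinity.
def ordered_logit_probs(xb, cuts):
    """Probabilities over the five complexity levels for linear index x'b."""
    cdf = 1.0 / (1.0 + np.exp(-(cuts - xb)))   # F(k_j - x'b) at finite cuts
    cdf = np.concatenate(([0.0], cdf, [1.0]))  # add the k_0 and k_5 endpoints
    return np.diff(cdf)                        # one probability per level

# Illustrative index and cut-points, not estimated values
p = ordered_logit_probs(xb=0.8, cuts=np.array([-2.0, -0.5, 1.5, 3.0]))
```

The five probabilities sum to one by construction; with these illustrative values, the middle (Level 3) outcome is the most likely, mirroring the billing distribution reported in Table 3.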

In equation (1), Xi represents a vector of independent variables indicating patient demographics and provider characteristics. Although claims data provide a rich source of information on provider behavior, potential independent variables are limited to the fields common to the claims forms for both programs. Given this limitation, the model includes age, sex, marital status, and urban residence to control for patient demographics. A dummy variable identifies providers who can be matched on both lists of participating physicians, controlling for providers not participating in both programs and those who use separate FTINs when billing Medicaid and the State Employees Health Plan.

Because sicker patients will also produce higher billing codes, the model includes controls for the 15 most expensive conditions and the patient's number of diagnoses that year. The claims data uses International Classification of Diseases, Ninth Revision, Clinical Modification (Centers for Disease Control and Prevention, 2007) codes to classify diagnoses, so the Clinical Classifications Software developed by the Agency for Healthcare Research and Quality (2007) was used to collapse the more than 12,000 potential diagnosis codes into 260 clinically meaningful categories (Elixhauser, Steiner, and Palmer, 2006). From these 260 categories, the model includes dummy variables for the 15 most expensive medical conditions: heart disease, pulmonary conditions, mental disorders, cancer, hypertension, trauma, cerebrovascular disease, arthritis, diabetes, back problems, skin disorders, pneumonia, infectious disease, endocrine, and kidney (Druss et al., 2002; Thorpe, Florence, and Joski, 2004). Lastly, the model includes dummy variables indicating the number of separate conditions, out of the 260 clinical conditions software categories, reported for that patient in the year of the claim.
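Counting a patient's distinct conditions after collapsing diagnoses into categories can be sketched as below; the toy ICD-9-to-category mapping stands in for the actual Clinical Classifications Software lookup.

```python
from collections import defaultdict

# Hypothetical sketch: count distinct collapsed condition categories per
# patient-year, as used for the condition-count dummies. `ccs_of` is a
# toy stand-in for the Clinical Classifications Software mapping.
ccs_of = {"4280": "heart", "25000": "diabetes", "4019": "hypertension"}

def condition_counts(claims):
    """claims: iterable of (patient_id, year, icd9_code) tuples."""
    seen = defaultdict(set)
    for patient, year, icd9 in claims:
        seen[(patient, year)].add(ccs_of[icd9])   # de-duplicate by category
    return {key: len(categories) for key, categories in seen.items()}
```

Repeat diagnoses in the same category add nothing, so the count reflects distinct conditions rather than visit volume.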

An array of program and year dummy variables tests the code creep and differential billing hypotheses. A Medicaid dummy flags all claims to Medicaid and tests whether physicians as a whole bill Medicaid differently than the State Employees Health Plan. Interactions between the Medicaid dummy and 2-year dummies test whether Medicaid versus the State Employees Health Plan relationship changes over time. Finally, dummies for 2002 and 2003 test whether physicians are billing increasingly higher codes every year. Table 3 presents the variable means and distribution of the dependent variable.
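Constructing the program, year, and interaction dummies is mechanical; a sketch with illustrative column names:

```python
import pandas as pd

# Sketch of the hypothesis-testing dummies: a program flag, year dummies
# (2001 is the omitted base year), and their interactions. Column names
# are illustrative, not the paper's variable names.
claims = pd.DataFrame({
    "program": ["medicaid", "plan", "medicaid", "plan"],
    "year": [2001, 2002, 2003, 2003],
})
claims["medicaid"] = (claims["program"] == "medicaid").astype(int)
for yr in (2002, 2003):
    claims[f"y{yr}"] = (claims["year"] == yr).astype(int)
    claims[f"medicaid_x_y{yr}"] = claims["medicaid"] * claims[f"y{yr}"]
```

The year dummies carry the code creep test, the Medicaid flag the differential billing test, and the interactions the test of whether the Medicaid gap changes over time.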

Table 3

Physician Office Visit Claims for the South Carolina Medicaid and State Employees Health Plan: 2001-2003

Results

The summary statistics in Table 3 indicate that physicians bill both Medicaid and the State Employees Health Plan in a similar manner, despite serving very different demographic groups. In both programs, physicians code one-half of their visits (49 percent for Medicaid and 50 percent for the State Employees Health Plan) at complexity Level 3. The lowest and highest complexities are both uncommon, with only 6 percent billed at Level 1 and 3 percent at Level 5. The remaining visits fall into the two other categories, with 24 percent billed at Level 2 and 18 percent at Level 4. Between the two programs, lower complexity visits were marginally more common in Medicaid, while the State Employees Health Plan displayed more Level 4 and Level 5 visits.

For the independent variables, Medicaid patients tend to be younger and less likely to be married, but females make two-thirds of the visits in both programs and another two-thirds of visits are made by individuals living in urban areas. Providers that cannot be matched across both datasets are more likely to appear in the State Employees Health Plan, with 97 percent of visits in Medicaid being made to physicians on both lists compared with 85 percent in the State Employees Health Plan. Finally, the case-mix controls varied widely by the sample drawn and should not be used to infer prevalence of these conditions in the Medicaid and State Employees Health Plan populations.

Table 4 presents two sets of estimates for the ordered logit model with provider fixed effects. Comparing the estimates from the two models (excluding case-mix variables and including case-mix variables) reveals the contribution of a sicker population to billing of higher complexity visits. The provider fixed effects control for time-invariant physician characteristics, including specialty and physician practice patterns.

Table 4

Provider Fixed Effects Ordered Logit Estimates of Visit Complexity: 2001-2003

In both models, visit complexity increased with time (p=0.000). Including the case-mix controls produced only modest reductions in the coefficients for the year dummies. Medicaid visits were billed at lower complexities in both models (p=0.000). The positive coefficients for the Medicaid*Year dummies indicate that the difference between Medicaid and State Employees Health Plan billing decreased over the 3-year period, but the decline was not statistically significant. For the sample used in Table 4, providers participating in both programs billed higher complexity visits, but this result proved sample dependent. All other estimates were robust across repeated samples.

The ordered logit estimates (Table 4) indicate that office visit complexities billed to Medicaid and the State Employees Health Plan did increase over the 3-year period after controlling for case mix and time-invariant physician characteristics, but they reveal little about the magnitude of the increase. Table 5 presents the average predicted probabilities for each complexity, illustrating the effect of code creep on physician billing. For each visit in the data, the model predicts the probability of the physician assigning each complexity level. The simulated values represent the averages for these probabilities for each complexity level (Table 5). Only the values for the simulated variable change, with all other variables in the model retaining their original values.

Table 5

Predicted Visit Complexities Using the Fixed Effect Ordered Logit Coefficients: 2001-2003

Table 5 simulates two scenarios. In the first scenario, all visits are billed under the prevailing billing patterns in 2001, 2002, and 2003. In the second scenario, all visits are billed under the billing patterns representative of the State Employees Health Plan and then with Medicaid billing patterns. Again, all other variables in the model retain their original values. The table shows each scenario, first, based on the model excluding the case-mix control variables, and second, with the case-mix variables.
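The simulation logic, holding all other covariates at their observed values while switching only the scenario dummies, can be sketched generically; the function and its arguments are illustrative.

```python
import numpy as np

# Hypothetical sketch of the Table 5 simulation: zero out the year (or
# program) dummies, switch on the scenario of interest, and average the
# model's predicted probabilities over every visit in the data.
def average_predicted_probs(X, predict_fn, dummy_cols, scenario_col):
    """X: visits-by-covariates array; predict_fn returns Pr(Y = j) rows."""
    Xs = np.array(X, dtype=float, copy=True)
    Xs[:, dummy_cols] = 0.0            # clear all scenario dummies ...
    if scenario_col is not None:
        Xs[:, scenario_col] = 1.0      # ... then set the scenario dummy
    return predict_fn(Xs).mean(axis=0) # average Pr(Y = j) across visits
```

With dummy columns for 2002 and 2003, passing scenario_col=None reproduces 2001 billing patterns, while pointing scenario_col at the 2003 dummy reproduces 2003 patterns; every other covariate keeps its observed value.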

The scenarios show the complexity of the average visit gradually increasing over the study period. Over the 3 years, Level 1 and Level 2 visits become progressively less frequent, with Level 1 declining from 5.9 percent to 4.6 percent and Level 2 decreasing from 25.7 to 22.1 percent. In contrast, visits coded at Levels 3-5 each become more common, with Level 3 increasing from 49.7 to 50.5 percent, Level 4 rising from 15.9 to 19.2 percent, and Level 5 increasing from 2.7 to 3.5 percent. Including the case-mix variables produces no appreciable difference in the predicted complexity, with the frequencies changing no more than one-tenth of 1 percent.

The significant difference between Medicaid and State Employees Health Plan billing manifests in the simulations, but the case-mix variables can account for some of the observed differences between the two programs. With the billing patterns typical of the State Employees Health Plan, 50.4 percent of all visits are coded at Level 3, compared with 49.7 percent in Medicaid. Adding in the case-mix controls narrows these differences across all coding options, with Level 3 State Employees Health Plan visits slipping to 50.3 percent, and Medicaid increasing to 49.9 percent. Similarly, Level 4 visits start higher in State Employees Health Plan, at 18.4 percent and Medicaid at 16.3 percent, but this difference declines to 17.9 and 16.9 percent after including the case-mix controls.

After controlling for case mix and physician characteristics, code creep increased the cost of the average visit by 2.2 percent annually over the study period (Table 5). The average costs collapse the billing distributions into a single number and are calculated by multiplying the percent of visits at each complexity by the 2003 Medicaid reimbursement rate for established patient office visits from Table 2. These Medicaid established patient rates were applied to all visits, covering both programs as well as the new patient and consultation codes. With this conversion, the average visit in 2001 cost $35.87, increasing by 1.9 percent to $36.54 in 2002, and then by 2.7 percent to $37.53 in 2003. Including the case-mix controls changed the cost of the average visit by no more than $0.04. Based on these average visits, the case-mix controls reduce the code creep estimate from 2.3 percent annually to 2.2 percent per year. Finally, comparing all Medicaid to all State Employees Health Plan visits reveals that physician claims for Medicaid visits averaged $0.48, or 1.3 percent, lower than the average State Employees Health Plan visit. This difference is less than one-half of the $1.14 spread between the average Medicaid and State Employees Health Plan visit when the case-mix controls are excluded.
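The growth rates implied by the reported average costs can be checked with simple arithmetic:

```python
# Arithmetic check on the reported averages: the average cost collapses
# the predicted level shares into dollars, and its year-over-year growth
# gives the code creep rate before case-mix adjustment.
costs = {2001: 35.87, 2002: 36.54, 2003: 37.53}      # from the text above
growth_2002 = costs[2002] / costs[2001] - 1           # growth into 2002
growth_2003 = costs[2003] / costs[2002] - 1           # growth into 2003
annualized = (costs[2003] / costs[2001]) ** 0.5 - 1   # 2-year annualized rate
```

The two yearly rates recover the 1.9 and 2.7 percent figures, and the annualized rate matches the 2.3 percent estimate reported before the case-mix controls are included.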

Discussion

The ordered logit estimates and their associated simulations indicate that code creep increased the payments for physician visits by 2.2 percent annually over the study period. Although the existence of code creep should be a concern for Medicaid agencies, only an estimate of the total cost of the issue can indicate whether code creep would prove a worthwhile program integrity target. In 2003, South Carolina's Department of Health and Human Services (2004) spent $73 million on physician office visits out of a total $244 million on all physician services. Excluding increases in utilization, Medicaid can expect code creep to inflate physician office expenditures by 2.2 percent per year or $1.6 million in 2004 and a total of $8.4 million over 2004-2008. It should be noted that these figures only consider physician office visits and exclude hospital-based expenditures. Additional research will be necessary to determine if billing by South Carolina physicians is representative of other States and to determine how code creep in physician office visits compares to other physician and hospital billing.
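The expenditure projection follows directly from compounding the 2.2 percent rate on the $73 million base:

```python
# Projection sketch: apply the 2.2 percent code creep rate to 2003
# office visit spending. Summing each year's increment over the prior
# year telescopes to the total rise in annual spending by 2008.
base = 73.0                                     # 2003 spending, $ millions
rate = 0.022                                    # annual code creep rate
increase_2004 = base * rate                     # first-year increase
total_2004_2008 = base * ((1 + rate) ** 5 - 1)  # cumulative rise by 2008
```

These reproduce the $1.6 million figure for 2004 and the $8.4 million total over 2004-2008, holding utilization constant.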

The key limitation to this study also highlights a difficulty program integrity offices face in addressing code creep. As Carter and colleagues (1990) highlighted, changes in billing can be attributed to true changes in case mix (sicker patients), improvements in coding (provider education), changes instituted by the payer (program reforms), and code creep. South Carolina Medicaid did not implement any program reforms during the study period, and the model includes case-mix variables to control for sicker patients. However, distinguishing code creep from legitimate improvements in coding attributable to provider education would require documentation audits of medical charts. Therefore, the 2.2 percent annual increase attributed to code creep in this article should be considered an upper bound, because it was not possible to distinguish code creep from legitimate improvements in coding. Code creep's diffuse nature makes it a difficult problem to address. Expensive and unpopular, chart reviews are unlikely to produce sufficient recoveries from audited physicians, but well publicized audits may hold sufficient deterrent value to make enforcement cost effective.

Conclusions

This study found significant code creep in both South Carolina Medicaid and the South Carolina State Employees Health Plan. No difference in code creep was observed across the two programs, with code creep increasing expenditures on physician office visits at a rate of 2.2 percent annually. The models also indicate that physician billing patterns differ between the two, with the Medicaid claims averaging 1.3 percent less expensive per visit than comparable State Employees Health Plan claims. Controlling for case-mix produced little change in the code creep estimates, but did account for one-third of the difference between the two programs.

Footnotes

The author is with the Ohio State University. The research in this article was supported by funding from the Strom Thurmond Institute of Government and Public Affairs at Clemson University and Clemson University's College of Health, Education, and Human Development's Summer Research Program. The statements expressed in this article are those of the author and do not necessarily reflect the views or policies of the Ohio State University, Clemson University, or the Centers for Medicare & Medicaid Services (CMS).

Reprint Requests: Eric E. Seiber, Ph.D., The Ohio State University, Division of Health Services Management and Policy, College of Public Health, 468 Cunz Hall, 1841 Neil Ave., Columbus, OH 43210. E-mail: eseiber@cph.osu.edu

References

  • Agency for Healthcare Research and Quality. Clinical Classifications Software (CCS). Internet address: http://www.hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp (Accessed 2007.)
  • American Medical Association. Current Procedural Terminology 2005. AMA Press; Chicago, IL: 2004.
  • Carter GM, Newhouse JP, Relles DA. How Much Change in the Case Mix Index is DRG Creep? Journal of Health Economics. 1990 Jul;9(4):411–428.
  • Centers for Disease Control and Prevention. International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM). Internet address: http://www.cdc.gov/nchs/about/otheract/icd9/abticd9.htm (Accessed 2007.)
  • Chan B, Anderson GM, Theriault ME. Fee Code Creep Among General Practitioners and Family Physicians in Ontario: Why Does the Ratio of Intermediate to Minor Assessments Keep Climbing? Canadian Medical Association Journal. 1998 Mar;158(6):749–754.
  • Chulis GS. Assessing Medicare's Prospective Payment System for Hospitals. Medical Care Review. 1991 Summer;48(2):167–206.
  • Cromwell J, Hoover S, McCall N, et al. Validating CPT Typical Times for Medicare Office Evaluation and Management (E/M) Services. Medical Care Research and Review. 2006 Apr;63(2):236–255.
  • Druss BG, Marcus SC, Olfson M, et al. The Most Expensive Medical Conditions in America. Health Affairs. 2002 Jul-Aug;21(4):105–111.
  • Elixhauser A, Steiner C, Palmer L. Clinical Classifications Software (CCS). Agency for Healthcare Research and Quality; 2006.
  • Greene WH. Econometric Analysis. 5th ed. Prentice Hall; Upper Saddle River, NJ: 2003.
  • Hill E. How to Get All the 99214s You Deserve. Family Practice Management. 2001 Oct;8(9):43–48.
  • Hsia DC, Ahern CA, Ritchie BP, et al. Medicare Reimbursement Accuracy Under the Prospective Payment System, 1985-1988. Journal of the American Medical Association. 1992 Aug 19;268(7):867–869.
  • Hsia DC, Krushat WM, Fagan AB, et al. Accuracy of Diagnostic Coding for Medicare Patients Under the Prospective-Payment System. New England Journal of Medicine. 1988 Feb;318(6):352–355.
  • King MS, Sharp L, Lipsky M. Accuracy of CPT Evaluation and Management Coding by Family Physicians. Journal of the American Board of Family Practice. 2001 May-Jun;14(3):184–192.
  • Lorence DP, Richards M. Variation in Coding Influence Across the USA—Risk and Reward in Reimbursement Optimization. Journal of Management in Medicine. 2002;16(6):422–435.
  • Lorence DP, Spink A. Regional Variation in Medical Systems Data: Influences on Upcoding. Journal of Medical Systems. 2002 Oct;26(5):369–381.
  • Lungen M, Lauterbach KW. Upcoding—A Risk for the Use of Diagnosis-Related Groups. Deutsche Medizinische Wochenschrift. 2000 Jul;125(28-29):852–856.
  • Nassiri A, Rochaix L. Revisiting Physicians' Financial Incentives in Quebec: A Panel System Approach. Health Economics. 2006 Jan;15(1):49–64.
  • Psaty BM, Boineau R, Lewis K, et al. The Potential Costs of Upcoding for Heart Failure in the United States. American Journal of Cardiology. 1999 Jul;84(1):108–109.
  • Silverman E, Skinner J. Medicare Upcoding and Hospital Ownership. Journal of Health Economics. 2004 Mar;23(2):369–389.
  • South Carolina Department of Health and Human Services. South Carolina Medicaid Annual Report for State Fiscal Year 2004. 2004.
  • StataCorp LP. Stata Statistical Software: Release 8. Stata Press; College Station, TX: 2003.
  • Steinwald B, Dummit L. Hospital Case-Mix Change Under Medicare: Sicker Patients or DRG Creep? Health Affairs. 1989 Summer;8(2):35–47.
  • Thorpe KE, Florence CS, Joski P. Which Medical Conditions Account for the Rise in Health Care Spending? Health Affairs. 2004 Jul-Dec;23(Supplement 2):437–445.
  • Wynia MK, Cummins DS, VanGeest JB, et al. Physician Manipulation of Reimbursement Rules for Patients: Between a Rock and a Hard Place. Journal of the American Medical Association. 2000 Apr 12;283(14):1858–1865.
  • Zavoina W, McKelvey RD. A Statistical Model for the Analysis of Ordinal Level Dependent Variables. Journal of Mathematical Sociology. 1975;4:103–120.

Articles from Health Care Financing Review are provided here courtesy of Centers for Medicare and Medicaid Services

