
This is hard to explain, so I will try to illustrate it with a small example:

`NDD = 11/1/2018 `

number of payments:

`1 0 2 0 2 1 1 0 2 1 1 1 `

Since the first month in NDD is `11`, the first element of my list will be `11`. To compute the next element, I take the first month (`11`) and subtract the first payment (`1`), so the second element is `10`. This proceeds, and the pattern is clear if you follow the logic, so I will have:

`11 10 10 8 8 6 5 4 4 2 1 12 `

To make it even more clear:

`number_of_payments = [1 0 2 0 2 1 1 0 2 1 1 1] `

Algorithm:

Step 1 - Create an empty list:

`dates = [] `

Step 2 - Append the month of NDD as the first element of dates:

`dates.append(NDD.month) `

Step 3 - Now perform this formula:

`for i in range(1, 12): dates.append((dates[i-1] + 12 - number_of_payments[i-1]) % 12) `

(replacing any resulting `0` with `12`)

Step 4 - The final result will be

`dates = [11 10 10 8 8 6 5 4 4 2 1 12] `
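Put together, the four steps above can be sketched as runnable Python, using the example values from the question (the 0-to-12 mapping at the end is my own addition to match the stated result):

```python
NDD_month = 11
number_of_payments = [1, 0, 2, 0, 2, 1, 1, 0, 2, 1, 1, 1]

dates = [NDD_month]                      # Step 2: start with the first month
for i in range(1, 12):                   # Step 3: subtract the previous payment, wrapping mod 12
    dates.append((dates[i - 1] + 12 - number_of_payments[i - 1]) % 12)
dates = [d if d != 0 else 12 for d in dates]  # map 0 back to 12 (December)

print(dates)
# [11, 10, 10, 8, 8, 6, 5, 4, 4, 2, 1, 12]
```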

Although I was able to do this, I also need to factor in the year that NDD started with, so **THE RESULT SHOULD BE:**

`11/18 10/18 10/18 8/18 8/18 6/18 5/18 4/18 4/18 2/18 1/18 12/17 `
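One way to carry the year along is to subtract whole months from an actual date rather than doing modular arithmetic on month numbers; `dateutil.relativedelta` then handles the year rollover automatically. A minimal sketch with the example values (the formatting and variable names are my own):

```python
from datetime import datetime
from dateutil.relativedelta import relativedelta

NDD = datetime.strptime('11/1/2018', '%m/%d/%Y')
number_of_payments = [1, 0, 2, 0, 2, 1, 1, 0, 2, 1, 1, 1]

dates = [NDD]
for p in number_of_payments[:-1]:
    # Subtracting whole months also rolls the year back when needed
    dates.append(dates[-1] - relativedelta(months=int(p)))

print(' '.join('{}/{}'.format(d.month, d.year % 100) for d in dates))
# 11/18 10/18 10/18 8/18 8/18 6/18 5/18 4/18 4/18 2/18 1/18 12/17
```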

Now, on to what I actually have. This is the type of NDD:

`print(type(NDD)) `

Here are a few values from NDD:

```python
print(NDD[0:3])
0   2018-08-01
1   2018-07-01
2   2018-11-01
```

Here is the `number_of_payments` information:

```python
print(type(number_of_payments))
<class 'list'>
```

Here is the first row (same as the example above):

```python
print(number_of_payments[0])
[ 0.  1.  0.  1.  1.  1.  0.  5.  1.  0.  2.  1.]
```

This is what I am trying in order to get the result, but it does not work:

```python
dates = []
for i in range(len(number_of_payments)):
    dates.append([NDD[i]])
    for j in range(1, len(number_of_payments[i])):
        dates[i].append((dates[i][j-1] + 12 - number_of_payments[i][j-1]) % 12)
for date_row in dates:
    for n, i in enumerate(date_row):
        if i == 0:
            date_row[n] = 12
print(dates[0])
```

I get this error:

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-123-907a0962fd65> in <module>()
      4     dates.append([NDD[i]])
      5     for j in range(1, len(number_of_payments[i])):
----> 6         dates[i].append((dates[i][j-1] + 12 - number_of_payments[i][j-1]) % 12)
      7 for date_row in dates:
      8     for n, i in enumerate(date_row):

pandas/_libs/tslib.pyx in pandas._libs.tslib._Timestamp.__add__ (pandas/_libs/tslib.c:22331)()

ValueError: Cannot add integral value to Timestamp without freq.
```
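The traceback occurs because `NDD[i]` is a pandas `Timestamp`, and pandas refuses to add a bare integer to a `Timestamp` without a frequency. One way around it, assuming `NDD` is a Series of Timestamps like the sample values shown above, is to seed each row with the integer month instead of the `Timestamp` itself:

```python
import pandas as pd

# Hypothetical stand-in for the question's NDD Series of Timestamps
NDD = pd.Series(pd.to_datetime(['2018-08-01', '2018-07-01', '2018-11-01']))

# Seed each row with the integer month rather than the Timestamp,
# so the later "(x + 12 - payments) % 12" arithmetic operates on ints
dates = [[ts.month] for ts in NDD]
print(dates)
# [[8], [7], [11]]
```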

I hope this is clear.

**Entire Code:**

```python
# In[9]:
# Import modules
import numpy as np
import pandas as pd
import datetime as dt
from functools import reduce
import datetime
from dateutil.relativedelta import *

# In[10]:
# Import data file
df = pd.read_csv("Paystring Data.csv")
df.head()

# In[11]:
# Get column data into a list
x = list(df)

# In[12]:
# Append column data into cpi, NDD, and as of dates
NDD = df['NDD 8/31']
cpi = df['Contractual PI']
as_of_date = pd.Series(pd.to_datetime(df.columns.str[:8], errors='coerce'))
as_of_date = as_of_date[1:13]
payment_months = pd.to_datetime(as_of_date, errors='coerce').dt.month.tolist()

# In[13]:
# Get cash flows
cf = df.iloc[:, 1:13].replace('[^0-9.]', '', regex=True).astype(float)
cf = cf.values

# In[14]:
# Calculate number of payments
number_of_payments = []
i = 0
while i < len(cpi):
    number_of_payments.append(np.round_(cf[:i + 1] / cpi[i]))
    i = i + 1

# In[15]:
# Calculate the new NDD dates
# dates = []
# for i in range(len(number_of_payments)):
#     dates.append([NDD_month[i]])
#     for j in range(1, len(number_of_payments[i][0])):
#         dates[i].append((dates[i][j-1] + 12 - number_of_payments[i][0][j-1]) % 12)
# print(dates[0])

d = []
for i in range(len(number_of_payments)):
    d.append(datetime.datetime.strptime(NDD[i], '%m/%d/%Y'))

def calc_payment(previous_payment, i):
    return previous_payment + relativedelta(months=(-1 * i))

dates = [d]
for p in number_of_payments:
    dates += [calc_payment(result[-1], p)]

# In[ ]:
# Calculate paystring
paystring = []
for i in range(len(payment_months)):
    for j in range(len(dates[i])):
        if payment_months[i] < dates[i][j]:
            paystring.append(0)
        elif NDD_day[j] > 1:
            paystring.append((payment_months[i] + 12 - dates[i][j]) % 12)
        else:
            paystring.append((payment_months[i] + 12 - dates[i][j] + 1) % 12)
print(paystring[0])
```

I am currently stuck on adapting Arnon Rotem-Gal-Oz's solution to this. Here is also a screenshot of the data frame. Please let me know if more information would help.

**Update:**

I cannot seem to get any good answers, since the only person who had a close solution deleted it. I have now posted this to https://www.codementor.io/u/dashboard/my-requests/5p8xirscop?from=active. I am paying 100 USD for anyone to give me a complete solution, and I mean totally complete, not just sort of complete.

`pd.to_datetime` + `np.cumsum`

Since you are using Pandas, I recommend you take advantage of the *vectorised* methods available to Pandas / NumPy. In this case, it seems like you wish to subtract the *cumulative sum* of a list from a fixed starting point.

```python
import pandas as pd
import numpy as np

NDD = '11/1/2018'
date = pd.to_datetime(NDD)
number_of_payments = [1, 0, 2, 0, 2, 1, 1, 0, 2, 1, 1, 1]

res = date.month - np.cumsum([0] + number_of_payments[:-1])
res[res <= 0] += 12

print(res)
# array([11, 10, 10, 8, 8, 6, 5, 4, 4, 2, 1, 12], dtype=int32)
```

You haven't provided an input dataframe, so it's difficult to determine exactly what help you need to implement the above logic, but it is easily extendable to larger data sets.
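For completeness, the same cumulative-sum idea can produce the month/year strings requested in the question by switching to monthly `pd.Period` arithmetic, which rolls the year back across the December boundary automatically. A sketch (note that the `%m/%y` format zero-pads single-digit months, e.g. `08/18` rather than `8/18`):

```python
import numpy as np
import pandas as pd

NDD = '11/1/2018'
number_of_payments = [1, 0, 2, 0, 2, 1, 1, 0, 2, 1, 1, 1]

# Work in monthly periods: subtracting an integer steps back whole
# months and crosses year boundaries correctly
start = pd.Period(pd.to_datetime(NDD), freq='M')
offsets = np.cumsum([0] + number_of_payments[:-1])
months = [start - int(k) for k in offsets]

print(' '.join(m.strftime('%m/%y') for m in months))
# 11/18 10/18 10/18 08/18 08/18 06/18 05/18 04/18 04/18 02/18 01/18 12/17
```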